## Longest subarray without repeated numbers

Given an array, find the length of the longest subarray which has no repeating numbers.

```
Input:
A = {1,2,3,3,4,5}
Output:
3
Explanation:
The longest subarrays without any repeating elements are {1,2,3} and {3,4,5}.
```

## Thought Process

1. Brute Force

In this approach, we consider every subarray and check whether it contains any repeating elements. How expensive is this? Generating all subarrays of an array takes O(n^2), and for each subarray we check whether its elements are unique (using a map), which takes O(n) time.

Time Complexity : O(n^3)
Space Complexity : O(n)
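As a sketch of this brute force (in Java, with a class and method name of my own choosing), it could look like:

```java
import java.util.HashSet;

public class BruteForce {
    // Check every subarray arr[i..j] and keep the longest one whose
    // elements are all distinct: O(n^2) subarrays, O(n) check each.
    public static int longestUniqueSubarray(int[] arr) {
        int best = 0;
        for (int i = 0; i < arr.length; i++) {
            for (int j = i; j < arr.length; j++) {
                HashSet<Integer> seen = new HashSet<>();
                boolean unique = true;
                for (int k = i; k <= j; k++) {
                    // add() returns false if the element is already present
                    if (!seen.add(arr[k])) { unique = false; break; }
                }
                if (unique) best = Math.max(best, j - i + 1);
            }
        }
        return best;
    }
}
```

On the example array {1,2,3,3,4,5} this returns 3, matching {1,2,3} and {3,4,5}.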

2. Sliding Window Solution

The Sliding Window Technique is a method for finding subarrays in an array that satisfy given conditions. If you are new to the technique, an introduction to the sliding window will be helpful. In short, we maintain a subset of items as our window, and resize and move that window until we find a solution.

We will use two pointers to construct our window. To expand the window, we have to increment the right pointer and to shrink the window we have to increment the left pointer.

We increment the right pointer by 1 as long as the property of the subarray is not violated. If the property gets violated, we shrink the window (increment the left pointer by 1) until the subarray no longer contains repeating elements. We repeat this step until the right pointer reaches the end of the array.

• Here, the property means the subarray should contain only unique elements.

Initially, the left and right pointers are both at index 0.

1. Add the value of arr[right] to the map.
• If map[arr[right]] is 1 after adding, the window is still valid: increment the right pointer by 1 and update the maximum window size.
• If map[arr[right]] is 2 after adding, the window contains a repeating element. Shrink the window from the left so that it again contains only unique elements: increment the left pointer and decrement map[arr[left]] by 1, repeating until map[arr[right]] becomes 1.
2. Repeat step 1 while right < arraySize.

Time Complexity – O(n), where n is the number of integers in the array.
Space Complexity – O(k), where k is the number of distinct integers in the input array. Note that k <= n: in the worst case the array has no repeating integers, so every element is added to the map.

### Implementation

```
#include <bits/stdc++.h>
using namespace std;

int lengthOfLongestSubarray(vector<int> v)
{
    if (v.size() == 0)
        return 0;

    map<int,int> mapy;
    int left = 0;
    int max_window = 0;

    for (int right = 0; right < (int)v.size(); right++)
    {
        int num = v[right];
        mapy[num] = mapy[num] + 1;

        // Shrink the window from the left until num is unique again
        while (left < right && mapy[num] > 1) {
            mapy[v[left]] = mapy[v[left]] - 1;
            left = left + 1;
        }
        // Update the maximum length of the window
        max_window = max(max_window, right - left + 1);
    }
    return max_window;
}

int main()
{
    vector<int> v = {1,2,3,3,4,5};
    cout << lengthOfLongestSubarray(v);
    return 0;
}
```

This post is contributed by Monika Bhasin.

## Ship capacity problem

There are some problems which do not appear to be binary search problems at first, for example, the ship capacity problem on LeetCode. The problem statement is:
A conveyor belt has packages that must be shipped from one port to another within D days.
The i-th package on the conveyor belt has a weight of weights[i]. Each day, we load the ship with packages on the conveyor belt (in the order given by weights). We may not load more weight than the maximum weight capacity of the ship.
Return the least weight capacity of the ship that will result in all the packages on the conveyor belt being shipped within D days.

For example:

```
Input: weights = [1,2,3,4,5,6,7,8,9,10], D = 5
Output: 15
Explanation:
A ship capacity of 15 is the minimum to ship all the packages in 5 days like this:
1st day: 1, 2, 3, 4, 5
2nd day: 6, 7
3rd day: 8
4th day: 9
5th day: 10

Note that the cargo must be shipped in the order given, so using a ship of capacity 14 and splitting the packages into parts like (2, 3, 4, 5), (1, 6, 7), (8), (9), (10) is not allowed.
```
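The day-splitting in the example can be reproduced with a small greedy simulation. Here is a sketch (class and method names are mine) that counts how many days a given capacity needs when packages are loaded in order:

```java
public class GreedyDays {
    // Greedily fill each day up to the capacity, in the given order,
    // and count the days needed. Assumes capacity >= the heaviest weight.
    public static int daysNeeded(int[] weights, int capacity) {
        int days = 1;
        int load = 0;
        for (int w : weights) {
            if (load + w > capacity) {
                days++;     // current package does not fit, start a new day
                load = 0;
            }
            load += w;
        }
        return days;
    }
}
```

For the example, a capacity of 15 yields 5 days, while a capacity of 14 would need 6 days, which is why 15 is the answer.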

## Ship capacity problem and binary search algorithm

At first glance, it does not appear to be a binary search problem. Where is the input we will search on? That's where we have to remember that to apply a binary search, we do not actually need an input set; all we need is a lower and an upper bound.

Hint 1
What is the minimum capacity you would need to ship this cargo? You can choose the lowest possible capacity if you have an infinite number of days to ship all the weights.

Hint 2
What if you have only one day to ship all the weights? In that case, what would the capacity of the ship be? Would you ever need more capacity than that?

To find these bounds, try to push the constraint to its extremes. In the ship capacity example, push the number-of-days constraint to an extreme. What if we had an infinite number of days to ship the cargo?

In that case, we still need a ship with capacity at least equal to the heaviest package: no matter how many days we have, a smaller ship cannot move that package.

Again, what if we had only one day to ship the whole cargo? Then we need a ship that can take all the weights in a single day, so the ship capacity should be the sum of all the weights.
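These two extremes give the search bounds directly; a minimal sketch (class and method names are my own):

```java
public class ShipBounds {
    // Lower bound: the heaviest single package. A ship with a smaller
    // capacity can never carry it, however many days we have.
    public static int lowerBound(int[] weights) {
        int max = 0;
        for (int w : weights) max = Math.max(max, w);
        return max;
    }

    // Upper bound: the sum of all weights. With this capacity, a
    // single day is enough to ship everything.
    public static int upperBound(int[] weights) {
        int sum = 0;
        for (int w : weights) sum += w;
        return sum;
    }
}
```

For the example weights [1..10], the bounds are 10 and 55, and the answer 15 lies between them.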

Now that we know the lower and upper bounds, all we have to do is adjust the capacity and check whether we can ship the cargo in D days. Start with the mid capacity and see if it works. If yes, try a smaller capacity; if not, try a greater one. All we are doing is finding the smallest feasible capacity between the lower and upper bounds. It is the first-occurrence problem now.

If you understood the concept but still need help with the implementation, please see the code below.

### Ship capacity problem implementation

```
public int shipWithinDays(int[] weights, int D) {

    int lowerLimit = 0;
    int upperLimit = 0;

    for (int i = 0; i < weights.length; i++) {
        upperLimit += weights[i];
    }

    //Binary search on the capacity range
    while (lowerLimit < upperLimit) {
        int shipCapacity = lowerLimit + (upperLimit - lowerLimit) / 2;

        if (isItPossible(D, shipCapacity, weights)) {
            upperLimit = shipCapacity;
        } else {
            lowerLimit = shipCapacity + 1;
        }
    }

    return lowerLimit;
}

private boolean isItPossible(int D, int shipCapacity, int[] weights) {

    int days = 1;
    int currentLoad = 0;
    int i = 0;

    while (i < weights.length) {
        if (weights[i] > shipCapacity) return false;

        if (currentLoad + weights[i] <= shipCapacity) {
            currentLoad += weights[i];
            i++;
        } else {
            //Current package does not fit, start a new day
            days++;
            currentLoad = 0;
        }
    }

    return days <= D;
}
```

The time complexity of the solution is O(n log c), where n is the number of weights and c is the capacity range.

Please share if you find anything wrong or missing. If you are looking for coaching to prepare for your technical interviews, please book a free session with us.

# Design a data structure with insert delete and getRandom in O(1)

The problem statement is to design a data structure which performs the following operations in O(1) time complexity:
1. Insert an element, `insert(int value)`
2. Remove an element, `remove(int value)`
3. Get random element, `getRandom()`

For example, insert three elements into the data structure:
insert(1): [1]
insert(2): [1,2]
insert(3): [1,2,3]

Remove 2 from it, remove(2): [1,3]. Now getRandom() should return 1 and 3 with equal probability.

These kinds of problems are easy and hard at the same time. The idea is to go step by step and solve each part. The first step is to define an interface for this data structure, which is easy given the definition of the problem.

```
public interface IRandomNumberGenerator {
    public boolean insert(int value);
    public boolean remove(int value);
    public int getRandom();
}
```

Now that the interface is ready, it is time to implement the class. First of all, we have to find a container to store all the elements. If we take an ArrayList, `insert()` is O(1), as we always add the new element at the end of the ArrayList. `getRandom()` is also O(1). However, there is a problem with `remove()`. To remove an element from an ArrayList, we have to scan the whole list to find it, remove it, and then shift all the elements to the right of the deleted element one index to the left. This is an O(n) operation.

## Insert delete and getRandom in O(1): selection of data structures

A problem with storing elements in an ArrayList is that on removal, we have to scan the list to find the location of the element to be removed. What if we already knew the location? We can store the position of each element in a HashMap which maps the value to its index in the ArrayList.

Now, `insert()` has to insert a value into two data structures: first into the ArrayList, and then the location of the value in the ArrayList into the HashMap. The remove operation can go straight to the location in the ArrayList and delete the element. Wait, we still have to move all the elements on the right one position to the left, so the worst-case complexity of `remove()` is still O(n).

We know one thing: if we remove the last element from the ArrayList, no shifting is required. What if we copy the last value to the index of the element to be removed and then just remove the last element? Be careful: we have to update the HashMap with the new index for the value that was at the last position of the ArrayList. This way, `remove()` is also O(1).

### Insert, delete and getRandom in O(1): implementation

```
package AlgorithmsAndMe;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.Map;
import java.util.Random;

public class RandomNumberGenerator implements IRandomNumberGenerator {

    private ArrayList<Integer> list;
    private Map<Integer, Integer> loc;
    private Random random;

    //Initializing the class
    public RandomNumberGenerator(){
        list = new ArrayList<>();
        loc = new HashMap<>();
        random = new Random();
    }

    @Override
    public boolean insert(int value) {
        /* If the hash already contains the key, it is a
           duplicate, so we just return false. */
        if(loc.containsKey(value)) return false;

        //Insert into the list
        list.add(value);

        //Save the location in the hash map
        loc.put(value, list.size()-1);
        return true;
    }

    @Override
    public boolean remove(int value) {
        /* If there is no entry in the hash, there is
           no such element in the ArrayList */
        if(!loc.containsKey(value)) return false;

        int location = loc.get(value);
        //Remove from the hash
        loc.remove(value);

        if(location != list.size()-1){
            /* Copy the last value in the array
               list to the current location */
            list.set(location, list.get(list.size()-1));

            //Update the location of the last element in the hash
            loc.put(list.get(location), location);
        }

        //Remove the last element from the ArrayList
        list.remove(list.size()-1);

        return true;
    }

    @Override
    public int getRandom() {
        return list.get(random.nextInt(list.size()));
    }
}
```
```
package AlgorithmsAndMe;

import static org.junit.Assert.*;

public class RandomNumberGeneratorTest {

    RandomNumberGenerator randomNumberGenerator =
            new RandomNumberGenerator();

    @org.junit.Test
    public void testInterface() {
        assertEquals(true, randomNumberGenerator.insert(4));
        assertEquals(true, randomNumberGenerator.insert(5));
        assertEquals(true, randomNumberGenerator.insert(3));
        assertEquals(true, randomNumberGenerator.insert(2));

        assertEquals(true, randomNumberGenerator.remove(4));

        int random = randomNumberGenerator.getRandom();
        System.out.println(random);
    }
}
```

The complexity of the whole data structure for insert, delete and getRandom is O(1).

#### Insert, delete and get random when duplicates are allowed

Let’s make this problem a bit more complex by allowing duplicate elements in the list. The first problem with the existing implementation is that it stores the location of an element in the ArrayList in a HashMap. If the same element can appear multiple times in the list, which location should we store? We should store all of them. That changes the definition of our HashMap to

```
Map<Integer, HashSet<Integer>>
```

HashSet implements the Set interface, backed by a hash table (actually a HashMap instance). It makes no guarantee about the iteration order of the set, but it gives us what we actually require: constant-time, O(1), insert and remove operations.
To know more about the complexity of various data structures in Java, follow Runtime Complexity of Java Collections and read why HashSet provides constant-time insert and remove operations.
Everything else follows the same process. In `insert()`, we add the element's location in the ArrayList to its HashSet in the hash table. While removing, we take one location of the element, put the last element of the ArrayList at that location, and update the HashSet of locations for the value that was at the last index: remove the last index from that set and add the new location. Both updates are O(1) operations. Finally, we remove the last element from the ArrayList.

`getRandom()` implementation remains same.

```
package AlgorithmsAndMe;

import java.util.*;

public class RandomNumberGenerator implements IRandomNumberGenerator {

    private ArrayList<Integer> list;
    private Map<Integer, HashSet<Integer>> loc;
    private Random random;

    //Initializing the class
    public RandomNumberGenerator(){
        list = new ArrayList<>();
        loc = new HashMap<>();
        random = new Random();
    }

    @Override
    public boolean insert(int value) {

        if(!loc.containsKey(value)){
            loc.put(value, new HashSet<>());
        }

        //Insert into the list
        list.add(value);

        //Save the location in the hash map
        loc.get(value).add(list.size()-1);
        return true;
    }

    @Override
    public boolean remove(int value) {
        /* If there is no entry in the hash, there is
           no such element in the ArrayList */
        if(!loc.containsKey(value)) return false;

        //Get one location of the element in the ArrayList
        HashSet<Integer> listLocations = loc.get(value);
        int location = listLocations.iterator().next();
        listLocations.remove(location);

        int lastIndex = list.size()-1;
        int lastElement = list.get(lastIndex);

        if(location != lastIndex){
            /* Copy the last value in the array
               list to the current location */
            list.set(location, lastElement);

            //Update the locations of the last element in the hash
            loc.get(lastElement).remove(lastIndex);
            loc.get(lastElement).add(location);
        }
        //Remove the last element from the ArrayList
        list.remove(lastIndex);

        if(listLocations.isEmpty()) loc.remove(value);
        return true;
    }

    @Override
    public int getRandom() {
        return list.get(random.nextInt(list.size()));
    }
}
```

Other problems which build on this concept are: designing an LRU cache, finding the first non-repeated character in a stream, etc.

Please share if there is anything wrong or missing. If you are preparing for an interview and need one to one personalized coaching, please reach out to us on [email protected]