# Array left and right sums

Let’s consider the following problem:

*You are given an array of N integers. Determine whether there’s an element x in the array such that the sum of the elements to the left of x equals the sum of the elements to its right.*

The first solution that comes to mind is to iterate over the array, picking the item at index **i**, calculating the sum **A[0, i - 1]** and the sum **A[i + 1, N - 1]**, and checking whether they are equal. This means that for every element of the array you do another pass over the array, summing up to **N - 1** elements each time, which results in **O( N^2 )** time complexity.
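The brute-force approach can be sketched as follows. This is a minimal illustration, not code from the post; the function name `balance_index_naive` and the convention of returning `-1` when no such element exists are my own assumptions:

```python
def balance_index_naive(a):
    """Brute force: for each index i, sum both sides and compare.

    Each iteration sums up to len(a) - 1 elements, so the overall
    running time is O(N^2)."""
    for i in range(len(a)):
        if sum(a[:i]) == sum(a[i + 1:]):
            return i
    return -1  # assumed convention: no balancing element found


# Example: in [1, 2, 3, 3], the element at index 2 balances 1 + 2 against 3.
print(balance_index_naive([1, 2, 3, 3]))  # → 2
```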

Can we do better? Let’s say we try to limit the number of passes over the array and use some extra variables to keep track of the values that we need to compare. We do one pass to calculate **rightSum**, which at the beginning will contain the sum of all the values of the array. We initialize another variable, **leftSum**, to zero.

We then iterate over the array, starting with the element at index 0. We check whether **leftSum == (rightSum - A[0])**: if so, we have found our solution; otherwise we add **A[0]** to **leftSum**. At the next step the equality check becomes the following: **leftSum == (rightSum - A[1] - A[0])**. But **A[0]** is the current value of **leftSum**, so we can rewrite that as: **leftSum == (rightSum - A[1] - leftSum)**.

In general, iterating on the array with the index variable **i**, we will need to check the following:

`leftSum == rightSum - leftSum - A[i]`

Whenever this comparison holds, we know that the element at index **i** is our solution. This algorithm is definitely smarter than the obvious one: it has **O( N )** time complexity and **O( 1 )** space complexity, since it does only two passes over the array and uses just two extra variables.
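Putting the pieces together, the single-pass check described above might look like this. Again a sketch rather than the post’s own code; the name `balance_index` and the `-1` return value are assumptions:

```python
def balance_index(a):
    """O(N) time, O(1) space: one pass to compute the total,
    one pass to test leftSum == rightSum - leftSum - A[i]."""
    right_sum = sum(a)  # first pass: sum of the whole array
    left_sum = 0
    for i, x in enumerate(a):
        # right_sum - left_sum - x is the sum of the elements after index i
        if left_sum == right_sum - left_sum - x:
            return i
        left_sum += x  # A[i] now belongs to the left side
    return -1  # assumed convention: no balancing element found


print(balance_index([1, 2, 3, 3]))  # → 2, same answer as the O(N^2) version
```

Note that **rightSum** is never modified during the loop: the sum of the elements to the right of **i** is derived from it on the fly as **rightSum - leftSum - A[i]**.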

Feel free to tweet me if you have suggestions on how to improve this post.