CHICIO CODING: Pixels. Code. Unplugged.

Arrays

Arrays are one of the most fundamental data structures in computer science. They provide a way to store elements in a contiguous block of memory, allowing efficient access and manipulation.

What is an Array?

An array is a fundamental data structure that stores a collection of elements of the same type in contiguous memory locations. Each element in the array can be accessed directly using its index, which makes operations like retrieval and traversal very efficient. Arrays are widely used because they provide a simple and systematic way to organize data in memory.

Arrays are important in computer science and programming because they form the foundation for more advanced data structures such as matrices, heaps, hash tables, and even dynamic arrays. They allow developers to efficiently store and process large amounts of data, making them essential for algorithms in searching, sorting, and optimization.

Memory Representation

At the memory level, arrays are stored at consecutive memory addresses. The index of an element acts as an offset from the base address of the array. Because all elements share the same type, each one occupies the same number of bytes, Size. If the array starts at BaseAddress(A[0]), then for any index i with 0 ≤ i < Length(A), the address of A[i] can be calculated as:

Address(A[i]) = BaseAddress(A[0]) + i × Size

This direct mapping ensures constant-time O(1) access to any element in the array.
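The address formula can be sketched numerically. This is a minimal illustration, assuming a hypothetical base address of 1000 and 4-byte integer elements; the names `addressOf`, `baseAddress`, and `elementSize` are chosen here for the example and are not part of any real API.

```typescript
// Hypothetical values for illustration: a 32-bit integer array
// whose first element lives at address 1000.
const baseAddress = 1000;
const elementSize = 4; // bytes per element

// Address(A[i]) = BaseAddress(A[0]) + i * Size
function addressOf(i: number): number {
  return baseAddress + i * elementSize;
}

console.log(addressOf(0)); // 1000
console.log(addressOf(3)); // 1012
```

Because the computation is a single multiply and add, the lookup cost does not depend on the index, which is exactly why array access is O(1).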

Operations and Time Complexity

The time complexity of insertion and deletion operations in arrays depends on where the operation occurs and the array type:

Insert/Delete at the end of the array:

  • Static arrays (fixed-size, e.g., C int arr[10]) cannot grow, so inserting beyond capacity requires creating a new array, costing O(n).
  • Dynamic arrays (e.g., JavaScript Array, Python list, Java ArrayList) usually have extra capacity, so appending at the end is O(1) amortized.
  • Deleting the last element is generally O(1) since no shifting is required.

Insert/Delete in the middle of the array:

  • All elements after the insertion/deletion point must be shifted to maintain contiguity.
  • This shifting leads to O(n) time complexity in both static and dynamic arrays, regardless of the array’s capacity.
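The shifting described above can be made explicit. Below is a sketch using a plain `number[]` so the O(n) element moves are visible; the helper names `insertAt` and `deleteAt` are illustrative, not a standard API.

```typescript
// Insert `value` at `index`, shifting later elements right (O(n)).
function insertAt(arr: number[], index: number, value: number): void {
  arr.push(0); // grow by one slot
  for (let i = arr.length - 1; i > index; i--) {
    arr[i] = arr[i - 1]; // shift one position to the right
  }
  arr[index] = value;
}

// Delete the element at `index`, shifting later elements left (O(n)).
function deleteAt(arr: number[], index: number): void {
  for (let i = index; i < arr.length - 1; i++) {
    arr[i] = arr[i + 1]; // shift one position to the left
  }
  arr.pop(); // drop the now-duplicated last slot
}

const a = [1, 2, 4, 5];
insertAt(a, 2, 3); // a is now [1, 2, 3, 4, 5]
deleteAt(a, 0);    // a is now [2, 3, 4, 5]
```

In real code you would use the language's built-ins (e.g., `Array.prototype.splice` in JavaScript/TypeScript), which perform the same shifting internally.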

The table below summarizes the information above and shows the time complexity of common array operations:

| Operation       | Average Case | Worst Case |
|-----------------|--------------|------------|
| Access          | O(1)         | O(1)       |
| Search          | O(n)         | O(n)       |
| Insert (end)    | O(1)         | O(n)       |
| Insert (middle) | O(n)         | O(n)       |
| Delete (end)    | O(1)         | O(n)       |
| Delete (middle) | O(n)         | O(n)       |

Classification of Arrays

Arrays can be classified in two main ways:

  1. Based on Size

    • Static Arrays: The size of the array is fixed at compile time and cannot be changed later. These arrays are simple and efficient but lack flexibility.
    • Dynamic Arrays: The size of the array can change at runtime. When the array reaches its capacity, it automatically resizes to accommodate more elements. This provides flexibility at the cost of some performance overhead during resizing.
  2. Based on Dimensions

    • One-dimensional Arrays: A linear list of elements accessed using a single index.
    • Multi-dimensional Arrays: Arrays with two or more indices. For example, a two-dimensional array represents data in rows and columns (like a matrix), while higher dimensions extend this concept to 3D or beyond.
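A two-dimensional array is typically stored in one contiguous block using row-major order, where element (row, col) maps to the flat index row × cols + col. A minimal sketch, with illustrative helper names `set` and `get`:

```typescript
// A 3x4 grid stored in a single contiguous buffer (row-major order).
const rows = 3;
const cols = 4;
const grid = new Array<number>(rows * cols).fill(0);

function set(row: number, col: number, value: number): void {
  grid[row * cols + col] = value; // flat index = row * cols + col
}

function get(row: number, col: number): number {
  return grid[row * cols + col];
}

set(1, 2, 42);
console.log(get(1, 2)); // 42
```

The same idea extends to higher dimensions by multiplying out the remaining axis sizes.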

Static vs Dynamic Arrays

An array can be either static or dynamic, depending on whether its size is fixed or flexible at runtime.

Static Arrays

The size of a static array is fixed at compile-time or at allocation, and cannot change during execution. The memory is allocated once, and it remains contiguous throughout the life of the array. These arrays are often allocated on the stack (in languages like C) or in a fixed block of memory. Because there is no overhead to resizing, operations like indexing are extremely efficient (O(1) in time). Static arrays are simpler and safer in contexts where the maximum size is known in advance.

The advantages of static arrays:

  • No resizing overhead or runtime allocation cost.
  • Memory layout is predictable.
  • Lower overhead in memory management (no reallocation).

The disadvantages of static arrays:

  • Inflexible: cannot grow beyond the allocated size.
  • Potentially wasteful if the allocated size is much larger than what’s used.
  • Not suitable when data size is dynamic or unknown.

Dynamic Arrays

A dynamic array can grow (or sometimes shrink) during runtime, adjusting to the number of elements it needs to hold.
Internally, a dynamic array maintains a capacity (the total allocated space) and a size (the number of elements in use).
When the size reaches the capacity and a new element is added, the dynamic array will resize.
The typical pattern is:

  • if size == capacity, allocate a new buffer, usually of size capacity * 2.
  • copy all existing elements to the new buffer.
  • free the old buffer (or let GC collect it).
  • continue adding the new element.

This doubling strategy ensures that while a single resize is expensive (O(n)), over many insertions the amortized cost per insertion remains O(1).
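The resize pattern above can be sketched as a minimal class. This is an illustrative toy (the class name `DynamicArray` and the starting capacity are assumptions for the example), not how any particular standard library implements it.

```typescript
// Minimal dynamic-array sketch: doubles its buffer when full.
class DynamicArray {
  private buffer: number[];
  private _size = 0;
  private _capacity: number;

  constructor(initialCapacity = 4) {
    this._capacity = initialCapacity;
    this.buffer = new Array(initialCapacity);
  }

  get size(): number { return this._size; }
  get capacity(): number { return this._capacity; }

  push(value: number): void {
    if (this._size === this._capacity) {
      // Allocate a buffer twice as large and copy the old elements over.
      this._capacity *= 2;
      const next = new Array<number>(this._capacity);
      for (let i = 0; i < this._size; i++) next[i] = this.buffer[i];
      this.buffer = next; // the old buffer becomes garbage
    }
    this.buffer[this._size++] = value;
  }

  at(index: number): number { return this.buffer[index]; }
}

const dyn = new DynamicArray(2);
dyn.push(1);
dyn.push(2);
dyn.push(3); // triggers a resize: capacity 2 -> 4
console.log(dyn.size, dyn.capacity); // 3 4
```

Real implementations differ in growth factor (often 1.5x or 2x) and in how they handle shrinking, but the size/capacity split is the same.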

[Dynamic array visualization: elements 1, 2, 3 stored in a buffer with Capacity: 4, Size: 3]

The advantages of dynamic arrays are:

  • flexible: can grow as needed.
  • efficient amortized insertion at the end: even though occasionally resizing costs O(n), the average cost per insertion remains O(1).
  • better suited to real-world usage with unpredictable data volume.

The disadvantages of dynamic arrays are:

  • resizing costs (allocation + copying) can be expensive.
  • memory overhead: capacity might exceed actual size, causing unused (wasted) space.
  • in certain scenarios, repeated resizing or too-aggressive resizing strategy can degrade performance.

Summary Table

| Feature           | Static Array                                      | Dynamic Array                                 |
|-------------------|---------------------------------------------------|-----------------------------------------------|
| Size              | Fixed at compile/alloc time                       | Flexible, can resize during runtime           |
| Memory allocation | Once, contiguous                                  | May reallocate and copy                       |
| Access time       | O(1)                                              | O(1)                                          |
| Insertion at end  | Only if a free slot exists, otherwise not possible| Amortized O(1); worst-case O(n) when resizing |
| Memory overhead   | Minimal (exact)                                   | Potential extra (unused capacity)             |
| Use case          | When maximum size is known                        | When size is not known or variable            |

Examples

Below you can find examples of both static and dynamic arrays in various programming languages.

Static Arrays

C

int arr[5] = {1, 2, 3, 4, 5}; 

Java

int[] arr = new int[5]; 
arr[0] = 1;
arr[1] = 2;

Dynamic Arrays

Java

import java.util.ArrayList;
ArrayList<Integer> arr = new ArrayList<>();
arr.add(1);
arr.add(2);
arr.add(3);

Python

arr = [1, 2, 3, 4, 5] 
arr.append(6)

TypeScript

let arr: number[] = [1, 2, 3, 4, 5]; 
arr.push(6);

Swift

var arr: [Int] = [1, 2, 3, 4, 5] // Swift arrays are dynamic
arr.append(6)

Kotlin

val arr = arrayListOf(1, 2, 3, 4, 5) // dynamic array
arr.add(6)

Cache Locality and Performance

One of the main reasons arrays are considered fast compared to other data structures (like linked lists) is because of cache locality.

What is Cache Locality?

Modern CPUs rely on a memory hierarchy:

  • Registers (fastest, smallest)
  • Cache (L1, L2, L3)
  • Main Memory (RAM)
  • Disk (slowest)

Accessing data in cache is orders of magnitude faster than accessing RAM.
Cache locality refers to the idea that data close together in memory is more likely to be accessed quickly because the CPU fetches it into cache in blocks called cache lines (typically 32–128 bytes).

Why Arrays Benefit

Arrays store elements in contiguous memory locations.
This means when the CPU loads one element, it also pulls in its neighbors into the cache. Iterating sequentially over an array is therefore very fast. In contrast, data structures like linked lists scatter nodes throughout memory. Even though the logical order is preserved, each access may require fetching from different memory locations, causing cache misses and degraded performance.

Types of Locality

There are two important types of cache locality:

  • Spatial Locality: accessing elements stored close together (e.g., arr[i] then arr[i+1]). Arrays excel here.
  • Temporal Locality: reusing the same element multiple times within a short time frame (e.g., repeated access to arr[i]).

Arrays can take advantage of both, but spatial locality is where they shine the most.

Example

Suppose we iterate through an array of 1 million integers:

const arr: number[] = Array.from({ length: 1_000_000 }, (_, i) => i + 1);
let total = 0;

for (let i = 0; i < arr.length; i++) {
  total += arr[i];
}

console.log(total);

This loop runs very fast because of sequential access:

  • When the CPU accesses arr[0], it loads a cache line containing several consecutive elements.
  • Subsequent accesses like arr[1], arr[2], etc., are already in the cache, so the CPU does not need to fetch from main memory.
  • This reduces memory latency drastically compared to non-contiguous structures like linked lists.
  • Even though the loop performs 1 million operations, the effective memory fetch cost is much lower thanks to spatial locality, which allows the CPU to process multiple elements per cache line fetch.

Cache locality explains why algorithms with the same time complexity (e.g., O(n) traversal of an array vs linked list) can have very different real-world performance.

Arrays as Building Blocks

Arrays form the foundation for other data structures:

  • Stack (LIFO): push/pop on top of array.
  • Queue (FIFO): enqueue/dequeue using circular buffer.
  • Hash Tables: arrays for buckets.
  • Heaps: binary heap stored as array.
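As an example of arrays as a building block, here is a minimal sketch of an array-backed stack; the class name `ArrayStack` is an illustrative choice. Push and pop both work at the end of the array, so each is amortized O(1).

```typescript
// Array-backed stack (LIFO): push/pop operate on the end of the array.
class ArrayStack<T> {
  private items: T[] = [];

  push(value: T): void { this.items.push(value); }

  pop(): T | undefined { return this.items.pop(); }

  peek(): T | undefined { return this.items[this.items.length - 1]; }

  get isEmpty(): boolean { return this.items.length === 0; }
}

const stack = new ArrayStack<number>();
stack.push(1);
stack.push(2);
console.log(stack.pop());  // 2
console.log(stack.peek()); // 1
```

A queue follows the same pattern but needs a circular buffer (head and tail indices that wrap around) to keep dequeue at O(1) without shifting.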

Techniques

Here are the main array techniques. Some of them will be covered in depth in later articles.

  • Two-Pointer Technique, used for problems involving pairs or subsequences.
  • Prefix Sum, helps compute cumulative sums efficiently.
  • Sliding Window, useful for subarray problems with constraints.
  • Hashing, use hashmaps for frequency or fast lookup.
  • Sorting, sometimes sorting simplifies the problem.
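As a taste of one of these techniques, here is a minimal prefix-sum sketch: cumulative sums are precomputed once in O(n), after which any range-sum query is answered in O(1). The function names are illustrative.

```typescript
// Build prefix[i] = sum of arr[0..i-1], with prefix[0] = 0.
function buildPrefixSums(arr: number[]): number[] {
  const prefix = new Array<number>(arr.length + 1).fill(0);
  for (let i = 0; i < arr.length; i++) {
    prefix[i + 1] = prefix[i] + arr[i];
  }
  return prefix;
}

// Sum of arr[left..right] inclusive, in O(1).
function rangeSum(prefix: number[], left: number, right: number): number {
  return prefix[right + 1] - prefix[left];
}

const prefix = buildPrefixSums([3, 1, 4, 1, 5]);
console.log(rangeSum(prefix, 1, 3)); // 1 + 4 + 1 = 6
```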

Exercises

| Exercise                              | Technique                       | Solution |
|---------------------------------------|---------------------------------|----------|
| Move Zeroes                           | Two Pointers                    | Solution |
| Majority Element                      | Hashing / Boyer-Moore Voting    | Solution |
| Remove Duplicates from Sorted Array   | Two Pointers                    | Solution |
| Best Time to Buy and Sell Stock       | Sliding Window                  | Solution |
| Rotate Array                          | Array Rotation                  | Solution |
| Product of Array Except Self          | Prefix Product                  | Solution |
| Best Time to Buy and Sell Stock II    | Greedy / Sliding Window         | Solution |
| Number of Zero-Filled Subarrays       | Counting                        | Solution |
| Increasing Triplet Subsequence        | Greedy                          | Solution |
| First Missing Positive                | Index Mapping / In-Place Hashing| Solution |