Learn

In mathematics and computer science, an algorithm is a finite sequence of mathematically rigorous instructions, typically used to solve a specific class of problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert code execution through various routes (referred to as automated decision-making) and deduce valid inferences (referred to as automated reasoning).

Coursework Overview

The courseworks in this module ask you to work with collections of data — lists of numbers, names, scores, or records. In almost every task you will need to either locate a specific value inside a collection or arrange the collection into a meaningful order before you can do anything useful with it. Choosing the right algorithm affects how quickly your program runs, how much memory it uses, and whether it even produces the correct answer in a reasonable time. The four algorithms below are the building blocks the courseworks are designed around.

Searching Algorithms

A searching algorithm answers the question: "Is this value in my list, and if so, where?" We cover two approaches. Linear Search is the simplest and works on any list. Binary Search is far faster but requires the list to already be sorted.

Linear Search

Core Idea

Linear Search scans every element of a list from the beginning until it either finds the target or exhausts the list. There is no trick — it is a plain left-to-right sweep.

Coursework relevance: When data is unsorted (e.g. raw user input, an unordered database result), Linear Search is often the only correct option. Many first-year courseworks give you an unsorted list and ask you to count, find, or filter values — Linear Search handles all of these.

Key Properties

  • Pre-requisite: None — works on any list, sorted or unsorted.
  • Best case: The target is the very first element — only 1 comparison needed.
  • Worst case: The target is last, or not present — the algorithm checks every element.
  • Time complexity: O(n) — performance scales linearly with list size.
  • Practical rule: Fine for small lists; becomes slow on lists of thousands of items.
Mental model: Imagine looking for a specific card in a shuffled deck by flipping through every card one at a time. Simple, but possibly slow.
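The left-to-right sweep can be written in a few lines. This is a minimal sketch, not a coursework template — the function name and sample list are illustrative:

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for index, value in enumerate(items):
        if value == target:   # found it: stop immediately
            return index
    return -1                 # exhausted the list without a match

scores = [42, 7, 19, 7, 88]
print(linear_search(scores, 19))  # → 2
print(linear_search(scores, 50))  # → -1
```

Note that the loop returns as soon as a match is found, which is what makes the best case a single comparison.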

Binary Search

Core Idea

Binary Search exploits the fact that a sorted list has a known structure. Instead of checking elements one by one, it looks at the middle element and immediately discards half the list based on whether the target is smaller or larger. It repeats this halving process until the target is found or the search space collapses to zero.

Coursework relevance: Courseworks that involve sorted data — ordered scoreboards, alphabetical name lists, ranked results — reward using Binary Search. When a task asks you to "efficiently find" a value, Binary Search is usually the expected answer.

Key Properties

  • Pre-requisite: The list must be sorted before you start.
  • Best case: The target is the middle element on the first check — 1 comparison.
  • Worst case: The target is absent or found on the final check — the list is halved repeatedly until the search space is empty, roughly log₂(n) comparisons.
  • Time complexity: O(log n) — doubling the list size adds only one extra comparison.
  • Practical rule: Dramatically faster than Linear Search on large sorted lists.
Mental model: Think of searching for a word in a dictionary. You open to the middle, decide which half contains your word, open to the middle of that half, and repeat. You never read every page.
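The halving process above can be sketched as an iterative loop. This is a minimal illustration (the function name and sample list are not from any coursework); note that it assumes the input list is already sorted:

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1    # target is in the right half: discard the left
        else:
            high = mid - 1   # target is in the left half: discard the right

    return -1                # search space collapsed to zero

names = ["Ada", "Grace", "Linus", "Margaret", "Tim"]
print(binary_search(names, "Linus"))  # → 2
print(binary_search(names, "Zoe"))    # → -1
```

Each pass through the loop shrinks the gap between low and high by half, which is where the O(log n) behaviour comes from.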

Sorting Algorithms

A sorting algorithm answers the question: "How do I arrange this list into order?" Sorted data is easier to search, easier to display, and often required by the problem specification itself. We cover two algorithms: Bubble Sort, which is easy to understand and implement, and Merge Sort, which is far more efficient for large lists.

Bubble Sort

Core Idea

Bubble Sort makes repeated passes through the list. On each pass it compares neighbouring pairs of elements and swaps them if they are in the wrong order. After the first pass, the largest value has "bubbled" to the end. After the second pass, the second-largest is in place — and so on. The algorithm stops when a complete pass produces no swaps, meaning the list is fully sorted.

Coursework relevance: Bubble Sort is the algorithm most courseworks expect you to implement from scratch first, because its logic maps almost directly to pseudocode. Tasks asking you to sort a list of scores, names, or records in ascending or descending order are typically solved with Bubble Sort at the foundation level.

Key Properties

  • Pre-requisite: None — works on any unsorted list.
  • Best case: List is already sorted — one pass confirms it, no swaps performed.
  • Worst case: List is in reverse order — maximum number of passes and swaps needed.
  • Time complexity: O(n²) — for large lists, performance degrades rapidly.
  • Practical rule: Straightforward to write and debug; too slow for very large datasets.
Mental model: Imagine sorting a row of people by height. You walk along the row, swap any two neighbours who are out of order, then walk back to the start and repeat until nobody needs swapping.
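The pass-and-swap logic translates almost line for line into code. A minimal sketch, with the early-exit check described above (the function name is illustrative):

```python
def bubble_sort(items):
    """Sort items in place, ascending, and return the list."""
    n = len(items)
    for end in range(n - 1, 0, -1):   # each pass fixes one more element at the end
        swapped = False
        for i in range(end):
            if items[i] > items[i + 1]:            # neighbours out of order?
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:    # a clean pass means the list is already sorted
            break
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # → [1, 2, 4, 5, 8]
```

The swapped flag is what gives the O(n) best case on an already-sorted list: one pass, no swaps, done.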

Merge Sort

Core Idea

Merge Sort uses a divide-and-conquer strategy. It repeatedly splits the list in half until every sub-list contains only one element (a single element is trivially sorted). It then merges pairs of sorted sub-lists back together in the correct order, building up to a fully sorted list. The merge step is the key: combining two already-sorted halves into one sorted list is efficient and straightforward.

Coursework relevance: When a coursework deals with large datasets or asks you to consider efficiency, Merge Sort is the right tool. It also appears in questions that ask you to explain or trace a divide-and-conquer approach, and it is the conceptual basis for understanding how professional sorting libraries work.

Key Properties

  • Pre-requisite: None — works on any unsorted list.
  • Best case: O(n log n) — even a perfectly ordered list takes this many operations.
  • Worst case: O(n log n) — performance is consistent regardless of initial order.
  • Time complexity: O(n log n) — significantly faster than Bubble Sort on large lists.
  • Trade-off: Requires extra memory for the temporary sub-lists during merging.
Mental model: Imagine splitting a messy pile of numbered cards in half, then each half again, until you have individual cards. Then merge pairs of cards into sorted pairs, pairs into sorted groups of four, and so on — until the whole pile is sorted.
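The split-then-merge strategy can be sketched with two short functions — one that recursively splits, one that merges two sorted halves. This is an illustrative sketch, not a definitive implementation:

```python
def merge_sort(items):
    """Return a new, ascending-sorted copy of items."""
    if len(items) <= 1:              # a single element is trivially sorted
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])   # sort each half independently...
    right = merge_sort(items[mid:])
    return merge(left, right)        # ...then combine them

def merge(left, right):
    """Merge two already-sorted lists into one sorted list."""
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:      # take the smaller front element
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])          # one side is exhausted;
    merged.extend(right[j:])         # the other is appended as-is
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # → [3, 9, 10, 27, 38, 43, 82]
```

The temporary merged list in the merge step is the extra memory mentioned in the trade-off bullet above.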

Algorithm Comparison at a Glance

Algorithm       Type       Requires sorted list?   Time complexity   Best use case
Linear Search   Searching  No                      O(n)              Small or unsorted lists
Binary Search   Searching  Yes                     O(log n)          Large sorted lists
Bubble Sort     Sorting    No                      O(n²)             Simple tasks, small lists
Merge Sort      Sorting    No                      O(n log n)        Large lists, efficiency required

Want to see some Examples?

Examples >