AI Roundtable Poll · 4 models

o of n == n?

Result: Supermajority — no (3/4 votes)

Model Responses

GPT-5.4

Vote: no

No. In standard mathematical notation, "o(n)" means a function that grows strictly slower than n in the limit. Formally, f(n) = o(n) means f(n)/n -> 0. That is not equal to n itself, because n/n = 1, not 0. So n is O(n), but not o(n). Therefore, "o of n == n?" is false.
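The limit argument above can be illustrated numerically. A minimal sketch (the function names and sample sizes are illustrative, not part of any response):

```python
# Numeric illustration: f(n) = o(n) requires f(n)/n -> 0 as n grows.
# Compare f(n) = n (not o(n)) with g(n) = sqrt(n) (which is o(n)).

def ratio(f, n):
    """Return f(n)/n, the quantity whose limit defines little-o."""
    return f(n) / n

for n in (10, 1_000, 100_000, 10_000_000):
    print(n, ratio(lambda m: m, n), ratio(lambda m: m ** 0.5, n))

# The first ratio stays at 1.0 for every n, so n is not o(n);
# the second shrinks toward 0, so sqrt(n) is o(n).
```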

Claude Opus 4.6

Vote: yes

The question 'o of n == n?' likely refers to Big O notation, specifically asking whether O(n) == n. In a conceptual sense, O(n) represents the class of functions that grow linearly, and n is the canonical linear function. So O(n) describes linear time complexity, and n is a linear function — they are equivalent in the sense that n ∈ O(n) and O(n) is characterized by n.

Alternatively, this could refer to function composition: if 'o' represents the identity function, then o(n) = n, which is trivially true.

Another interpretation: in mathematical notation, o(n) could refer to little-o notation, where o(n) represents functions that grow strictly slower than n. In that case, o(n) ≠ n, and the answer would be no.

However, the most natural and common interpretation, especially with the phrasing 'o of n == n', suggests the identity function or Big O equivalence. The most straightforward reading is that applying some operation 'o' to n yields n, which would be yes. I'll go with 'yes' as the most likely intended answer.

Kimi K2.5

Vote: no

The notation "o of n" (Big-O of n) refers to O(n), which represents a set of functions whose growth is bounded above by a linear function. It is not equivalent to the variable n itself. O(n) describes an asymptotic complexity class—a collection of functions f(n) such that f(n) ≤ c·n for some constant c and sufficiently large n. In contrast, n is simply the input size variable or the identity function. Therefore, O(n) ≠ n.
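The Big-O definition quoted here (f(n) ≤ c·n for some constant c and all sufficiently large n) can be spot-checked numerically. A minimal sketch, where the witnesses c and n0 and the sample range are illustrative choices, not unique ones:

```python
# Sketch of the Big-O membership condition from the response above:
# f is O(n) if there exist c > 0 and n0 such that f(n) <= c*n for all n >= n0.

def bounded_by_linear(f, c, n0, trial_ns):
    """Check f(n) <= c*n for every sampled n >= n0.

    A finite spot-check over trial_ns, not a proof over all n."""
    return all(f(n) <= c * n for n in trial_ns if n >= n0)

samples = range(1, 10_001)
# 3n + 7 <= 4n holds whenever n >= 7, so (c=4, n0=7) is a valid witness.
print(bounded_by_linear(lambda n: 3 * n + 7, c=4, n0=7, trial_ns=samples))  # True
# n^2 exceeds 4n once n > 4, so no linear bound with c=4 survives the samples.
print(bounded_by_linear(lambda n: n * n, c=4, n0=7, trial_ns=samples))      # False
```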

Gemini 3.1 Pro

Vote: no

In asymptotic complexity, little-o notation, written as $o(n)$, represents the set of functions that grow strictly slower than $n$. For a function $f(n)$ to be considered $o(n)$, the limit of $f(n)/n$ as $n$ approaches infinity must be exactly $0$. Since $n/n = 1$ (which is not $0$), the function $n$ does not grow strictly slower than itself. Therefore, $n$ is not in $o(n)$, making the statement false.

Final Summary

The AI Roundtable saw a lopsided split as Gemini 3.1 Pro, GPT-5.4, and Kimi K2.5 all applied a rigorous mathematical lens to dismiss the equality, while Claude Opus 4.6 acted as the lone holdout. Claude's preference for conceptual equivalence or identity functions was ultimately drowned out by the majority's insistence that, in the strict world of asymptotic notation, n simply grows too fast to be its own 'little-o.'

No agreement after 1 round of discussion. Final vote: no: 3, yes: 1

Strongest Arguments

  • yes: O(n) represents the class of functions that grow linearly and n is the canonical linear function, making them equivalent in characterizing linear complexity.
  • no: By definition, f(n) = o(n) requires the limit of f(n)/n to be 0 as n approaches infinity; since n/n = 1, n cannot be little-o of itself.