"Consider the complexity on a scale from 1 to Java..." --one of my software engineersIn programming terms, essential complexity of a task such as "Determine whether a list contains duplicate elements" dictates in the general case that you must examine each pair of elements at least once. There is the question of loop optimization, to be sure, but essential complexity looks at the problem space and the algorithm, while the remaining complexity is taken up with the mechanics of the data structure, its iterators, and whatever syntactical framework is required.
Programs therefore commit one of two opposite errors. Incorrect programs often oversimplify the problem and miss corner cases. Corner cases are part of the essential complexity of the problem; handling them elegantly and efficiently is part of the task.
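As a sketch of the first error (again with invented names), a version that calls equals() directly looks correct until the list contains a null element; the corner case was part of the problem all along.

    import java.util.Arrays;
    import java.util.List;

    public class NaiveDuplicates {
        static <T> boolean containsDuplicate(List<T> items) {
            for (int i = 0; i < items.size(); i++) {
                for (int j = i + 1; j < items.size(); j++) {
                    // Looks right, but throws NullPointerException when
                    // items.get(i) is null -- a corner case the task includes.
                    if (items.get(i).equals(items.get(j))) {
                        return true;
                    }
                }
            }
            return false;
        }

        public static void main(String[] args) {
            System.out.println(containsDuplicate(Arrays.asList("a", "b", "a")));   // true
            System.out.println(containsDuplicate(Arrays.asList("a", null, null))); // crashes
        }
    }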
More often, though, programmers err in the opposite direction: they address parasitic or imaginary requirements with a plethora of poorly managed, strung-together components, an overgeneralized framework, or other elements that add complexity to the solution far beyond what the problem requires.
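For contrast, a caricature of that second error, sketched with invented names rather than any real framework: the same duplicate check buried under a strategy interface, a factory, and extra indirection that no requirement asked for, none of which touches the essential problem.

    import java.util.List;
    import java.util.Objects;

    // Invented names throughout; a caricature of the over-engineered path.
    interface ComparisonStrategy<T> {
        boolean matches(T left, T right);
    }

    interface DuplicateDetector<T> {
        boolean detect(List<T> items);
    }

    class ConfigurableDuplicateDetector<T> implements DuplicateDetector<T> {
        private final ComparisonStrategy<T> strategy;

        ConfigurableDuplicateDetector(ComparisonStrategy<T> strategy) {
            this.strategy = strategy;
        }

        @Override
        public boolean detect(List<T> items) {
            for (int i = 0; i < items.size(); i++) {
                for (int j = i + 1; j < items.size(); j++) {
                    if (strategy.matches(items.get(i), items.get(j))) {
                        return true;
                    }
                }
            }
            return false;
        }
    }

    class DuplicateDetectorFactory {
        static <T> DuplicateDetector<T> createDefault() {
            // All of this scaffolding exists to reach the same Objects.equals
            // call as the plain two-loop version above.
            return new ConfigurableDuplicateDetector<>(Objects::equals);
        }
    }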