Often, when I try to describe mathematics to the layman, I find myself struggling to convince them of the importance and consequence of 'proof'. I receive responses like "surely if the Collatz Conjecture is true up to $20 \times 2^{58}$, then it must always be true?" and "the sequence of the number of edges on a complete graph starts $0, 1, 3, 6, 10$, so the next term must be $15$, and so on".
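(As an aside, "true up to $20 \times 2^{58}$" means the conjecture has been verified by direct computation for every starting value below that bound. Here is a minimal sketch of what such a check looks like, in Python; the function name and the tiny bound are mine and purely illustrative.)

```python
# Minimal sketch of "checking the Collatz conjecture up to a bound":
# confirm the 3n+1 iteration reaches 1 for every starting value below N.
# The bound here is tiny and purely illustrative; the bound quoted above
# is far beyond what a naive loop like this could reach.

def reaches_one(n: int) -> bool:
    """Follow the Collatz iteration from n until it hits 1.

    Note: this loop terminates only because the conjecture does in fact
    hold for the small values we feed it; a hypothetical counterexample
    would make it run forever.
    """
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
    return True

N = 10_000  # illustrative bound
assert all(reaches_one(n) for n in range(1, N))
print(f"Collatz iteration reaches 1 for every starting value below {N}")
```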
Granted, this second statement is less logically unsound than the first, since it is not difficult to see why the sequence must continue as it does; nevertheless, it rests on a premise that boils down to "interesting patterns must always continue".
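(For concreteness, the one-line reason the second pattern does continue: the complete graph $K_n$ has an edge for every pair of its $n$ vertices, so
$$\#\,\text{edges}(K_n) = \binom{n}{2} = \frac{n(n-1)}{2},$$
which for $n = 1, 2, 3, 4, 5, 6$ gives $0, 1, 3, 6, 10, 15$.)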
I try to counter this logic with a deliberately ridiculous argument like "the numbers $1, 2, 3, 4, 5$ are all less than $100$, so surely all numbers are", but this usually fails to convince.
So, are there any examples of non-trivial patterns that appear to be true for a large number of small cases, but then fail for some larger case? A good answer to this question should:
1. be explainable to a layman without subjecting them to a 24-lecture course of background material, and
2. have as a minimal counterexample a case which cannot feasibly be checked without the use of a computer.
I believe conditions 1. and 2. make my question specific enough to have, in some sense, a "right" (or at least a "not wrong") answer; but I'd be happy to clarify if this is not the case. I suppose I'm expecting an answer to come from number theory, but I can see that areas like graph theory, combinatorics more generally, and set theory could also offer suitable answers.