No, abstractions don't prevent you from understanding how things work. Abstractions allow you to understand why (to what end) things work the way they do.
First off, let's make one thing clear: pretty much everything you've ever known sits at some level of abstraction. Java is an abstraction, C++ is an abstraction, C is an abstraction, x86 is an abstraction, ones and zeroes are an abstraction, digital circuits are an abstraction, integrated circuits are an abstraction, amplifiers are an abstraction, transistors are an abstraction, circuits are an abstraction, semiconductors are an abstraction, atoms are an abstraction, electron bands are an abstraction, electrons are an abstraction, and quarks could be an abstraction too - I don't really know, I'm just guessing based on the pattern. By the logic that low-level knowledge is required to understand how something really works, if you want to understand how computers really work, you need to study physics, then electrical engineering, then computer engineering, then computer science, and essentially work your way up in terms of abstraction. (I've taken the liberty of not mentioning that you'd also need to study math first, to really understand physics.)
Now realistically, the days when you could make sense of computers and programming by building your way up from the lowest-level details were the earliest days of computing. The field has since advanced to the point where it can't possibly be rediscovered from scratch by a single person. There are hundreds of thousands of very qualified people specializing in every level of abstraction, working hard daily to make advances that you can't hope to understand without spending years studying a specific slice thoroughly and committing to keeping up with the latest developments there.
As an example, consider this Java snippet:
public void example() {
    Object obj = new String("...");
    // ...
}
Unless you are well-versed in stack frames, heap data structures, concurrent generational tracing GC, heap compaction, static analysis (escape analysis in particular), virtual machines, dynamic analysis, assembly language, and executable space protection, you can't honestly claim to know what actually happens when you run something as simple as that snippet.
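If you want to watch just one of those layers at work, here's a minimal sketch - the class, method name, and loop count are mine, purely for illustration - built around HotSpot's escape analysis, a real optimization that is on by default and controlled by the -XX:+DoEscapeAnalysis flag. Whether the allocation actually disappears depends on your JVM version and settings:

public class EscapeDemo {
    // The String created here never escapes the method, so the JIT is free
    // to scalar-replace it: no heap allocation, nothing for the GC to trace.
    static int localOnly() {
        return new String("...").length();
    }

    public static void main(String[] args) {
        long sum = 0;
        // Run the method enough times for the JIT to compile and optimize it.
        for (int i = 0; i < 1_000_000; i++) {
            sum += localOnly();
        }
        System.out.println(sum);
    }
}

Try running it with -verbose:gc, then again with -XX:-DoEscapeAnalysis, and compare the GC activity. The exact output isn't the point; the point is that the heap allocation you "see" in the source may never happen.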
As another example, consider this C snippet:
#include <stdio.h>

void example(int i) {
    int j;
    if (i == 0) {
        j = i * 2;
        printf("Received zero, printing %d\n", j);
    } else {
        printf("Received non-zero, printing %d\n", j);
    }
}
If you show it to a beginner, they'll tell you that when the argument is non-zero, the residual contents of some memory location will be printed - because variables are really just memory addresses behind the scenes, not initializing one simply means nothing gets moved to its address, and there's nothing magic about any of it. If you show it to a non-beginner, they'll tell you that this program's behavior is undefined for non-zero values of the function's parameter, and that the compiler could legally remove the conditional, treat all arguments to this function as zero, replace all of the function's call sites with calls that pass zero as the argument, set every variable that is ever passed as an argument to this function to zero, or do other paranormal stuff.
The beginner in this example took everything they know into account and arrived elegantly at a completely wrong answer, because (a) they didn't read the spec of the language (which is an abstraction on purpose, not because C programmers aren't clever enough to understand computer architecture) and (b) they tried to reason about implementation details they didn't fully grasp and which have evolved well beyond their mental model by now. The example is fictional, but it draws on everyday real-world misconceptions - the kind that sometimes lead to perilous bugs and occasionally to famous security holes.
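For contrast, here's what a defined-behavior version of that C snippet might look like - the main function is my addition, just to make it runnable. Once j is initialized on every path, the compiler loses its license to do any of the paranormal stuff above:

#include <stdio.h>

void example(int i) {
    int j = 0; /* initialized on every path: reading it is now always defined */
    if (i == 0) {
        j = i * 2;
        printf("Received zero, printing %d\n", j);
    } else {
        printf("Received non-zero, printing %d\n", j);
    }
}

int main(void) {
    example(0);
    example(42);
    return 0;
}

For what it's worth, most compilers will also flag the original if you ask: warnings like -Wuninitialized (typically enabled as part of -Wall in GCC and Clang) exist precisely because this misconception is so common.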
You're saying you aim to be a better programmer. Some of the best programmers I've met are the way they are because they understand abstractions and can carry that knowledge to any language, adapting it to whatever problem they need to solve, at whatever level they happen to be working on. Some of the worst programmers I've met are the way they are because they insist on focusing on details and trivia they don't really understand - details which much of the time aren't up-to-date, or relevant to the problem, or applicable in the context they're used in, or were never true in the first place.
Don't fall into that trap. After all, software runs at many levels of abstraction at once, and you aren't limited to understanding just one of them. You can understand one level, then move on to another.