Programming is, ultimately, self-taught. You can take classes to help that process along, but eventually you have to form and refine your own mental models of the language, the machine, and your program. If you're not spending time outside of class doing this work, you're memorizing trivia, not learning to program.
The audience for computer code is other human beings; in particular, it is you, six months from now. Be considerate to these people.
Coding is easy; debugging is hard. Code to make debugging trivial. That can mean test cases, source control, or smaller, well-defined functions... basically, all of the "best practices" out there speak to this.
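As a small illustration of that last point (the function and tests here are invented for the example), a short function with one clear job, paired with a couple of assertions, means a failure points at a few lines rather than a whole program:

```python
def median(values):
    """Return the median of a non-empty sequence of numbers."""
    if not values:
        raise ValueError("median() requires at least one value")
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2:  # odd count: middle element
        return ordered[mid]
    # even count: average the two middle elements
    return (ordered[mid - 1] + ordered[mid]) / 2

# Tiny test cases: if one of these fails, the bug is in ~10 lines of code.
assert median([3, 1, 2]) == 2
assert median([1, 2, 3, 4]) == 2.5
```

The point is not the median itself; it is that the function is small enough to reason about in one sitting, and the tests document what "correct" means before anything breaks.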
A debugged, tested, and documented line of code costs on the order of $100. The best code is code that's never written because you were able to come up with a non-coding solution or you reused existing code. The second-best solution involves deleting code to get the functionality you need. Remember that as a programmer, you're hired to solve problems, not write code.
Back when you were learning to read and write, you spent far more time reading than writing. Current pedagogy emphasizes writing code over reading it: what was the last programming class you took where you started by reading an existing 1,000-line program? As a professional, you'll spend most of your time reading code you didn't write in order to find a bug or add a feature. It's a skill worth developing early.