Code with 100% test coverage can still be terrible, and it is important to realize that writing good code is the main objective. If a solution takes the wrong approach, the tests are likely to be bad too (imagine a messy nest of mocks and stubs), which only adds more cruft to the codebase.
Tests are great insofar as they encourage better code to be written. Testing bad code is hard, so TDD may force developers to refactor crappy methods (or to do crazy things in the tests and leave the bad methods unchanged). But automated code review tools like CodeClimate can also flag the crappy methods that need refactoring, so tests are not required to identify bad methods. Tests also do not teach object-oriented design, and they will not help a programmer who doesn't know how to write good code in the first place.
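As a hypothetical illustration of why testing tangled code is hard (the class and file names here are made up for the example): a method with hidden dependencies can only be tested through stubs, while the refactored version needs none.

```ruby
# Hard to test: the method reaches out to the filesystem and the clock,
# so a test must stub File.read and Time.now before it can run.
class Report
  def summary
    data = File.read("/var/data/sales.csv")  # hidden file dependency
    "#{Time.now.year} total: #{data.split(',').sum(&:to_i)}"
  end
end

# Easier to test: the same logic with its inputs passed in explicitly.
# No stubs required -- just call it with plain values.
class RefactoredReport
  def summary(data, year)
    "#{year} total: #{data.split(',').sum(&:to_i)}"
  end
end

puts RefactoredReport.new.summary("1,2,3", 2024)
```

The refactoring is what made the code testable; the test merely rewarded it.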
It seems like a lot of developers don't know the basics of object-oriented design, so it will be hard for them to implement a technique like dependency injection without first reading Practical Object-Oriented Design in Ruby. A brilliant programmer may be able to deduce object-oriented design principles from personal experience, but it is easier to go to the source and simply learn them. All the coding schools seem to emphasize testing and ignore object-oriented design, so I guess my opinion is unpopular.
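To make the dependency injection point concrete, here is a minimal Ruby sketch (the `Invoice` and `PdfPrinter` names are invented for this example, not from any particular codebase):

```ruby
# A concrete collaborator. Any object responding to #print could stand in.
class PdfPrinter
  def print(text)
    "PDF: #{text}"
  end
end

class Invoice
  # The printer is injected through the constructor instead of being
  # instantiated inside the class, so callers control the dependency.
  def initialize(printer: PdfPrinter.new)
    @printer = printer
  end

  def deliver(amount)
    @printer.print("Invoice for $#{amount}")
  end
end

puts Invoice.new.deliver(100)
```

Because the dependency is injected, swapping in a fake for a test is just `Invoice.new(printer: fake)` — no mocking framework gymnastics required.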
Coding schools seem to think it is best to teach testing to absolute beginners writing their first "Hello World!" app. I think it is better to teach beginners how to code first, then teach object-oriented design, and then teach testing. Simply focusing on object-oriented design techniques and leveraging automated code review tools can yield better code for a significant subset of programmers. Tests are great, but it is better to focus on learning how to write good code first.