Should it be mandatory to have a teaching degree? I really hope I don't offend anyone with this post, but the issue I'm about to write about really bothers me.

I live in NC. In NC they have something called Lateral Entry. That means someone with a bachelor's degree can take about 6 classes to become a teacher, even an EC teacher. They can work toward their licensure while they are teaching kids. This offends me greatly. I work with a lot of people who chose to do this, and they are lacking the foundation of teaching. They may know their content, but they don't know all of the things you learn in order to be an effective teacher.

I changed my major halfway through college and ended up having to take 152 credits to graduate with a bachelor's degree. (I did it in 4 years too!) I busted my butt in practicums, student teaching, etc. I felt invested in this career before I even started. Now I wonder what I did all of that for, when I could have just graduated with my original degree (social work) and still become a teacher. I have no backup plan like these people do. I can't tell you the number of times I have heard someone say, "I can still do xyz and make more money than this, so I'm not worried." It makes me feel like we are not professionals, like people think anyone can be a teacher.

Think about it. If I decided to be a nurse, I would have to go back to school for nursing and do everything everyone else had to do to get that degree. I couldn't just walk into a hospital, take 6 classes, pass the licensing exam, and be a nurse. It's a process, and I hate that not everyone goes through it.

This is my 7th year teaching, and it still bothers me. What do you think?