I was wondering if anyone knows what majoring in business in college entails, and what sorts of jobs I would be able to get after I graduate. Some professions seem clear-cut to me: lawyers work for law firms, doctors work for hospitals, teachers work for schools, and so on. Those career paths also seem straightforward, with jobs that are readily available. But as a business major, I have no idea how I would go about finding a job once college is over. Does anyone know?