This essay examines the evolving role of the community college in the American higher education system.
The two-year community college originated in the United States, tracing back to the late nineteenth and early twentieth centuries. Since its inception, the community college has been regarded as a significant component of the U.S. higher education system.
The term community college has been used interchangeably with terms such as “junior college,” “technical college,” and “alternative college” (Santos & Santos, 2006, p. 38). By all indications, community colleges are perceived as separate from, yet connected to, the conventional four-year college system.
As Santos and Santos (2006) explain, community colleges provide “comprehensive curricular offerings,” including “academic transfer preparation, vocational-technical education, continuing education, developmental education and community services” (pp. 38-39). By the 1990s, community colleges had become increasingly linked to vocational and workforce training and development (Santos & Santos, 2006). According to Baum, Little, and Payea (2011), community colleges serve as “the access point to higher education for many students” (p. 1).
Access to higher education in the U.S. remains a problem, as the socio-economically disadvantaged can rarely afford the high cost of tuition at accredited four-year colleges. Lower tuition and less stringent admission requirements have long made community colleges an alternative for this underserved population of Americans aspiring to a post-secondary education. For the most part, students enter community colleges with a view to eventually transferring to a four-year college (Beach, 2010).
Conventional wisdom dictates that once a student completes a two-year college degree and can