The unit with which we measure student credit for learning is based on an idea originally conceived as a way to fund teacher pensions (source).
Anthony Bryk writes:
> Early in the twentieth century, the industrialist Andrew Carnegie established the Carnegie Foundation for the Advancement of Teaching to create a pension system for the nation’s college professors. The introduction of this pension system proved an ingenious educational reform. At the time, American higher education was a largely ill-defined enterprise, with the difference between high schools and colleges often unclear.
>
> To qualify for participation in the Carnegie pension system, higher education institutions were required to adopt a set of basic standards around courses of instruction, facilities, staffing, and admissions criteria. The Carnegie Unit, also known as the credit hour, became the basic unit of measurement both for determining students’ readiness for college and for tracking their progress through an acceptable program of study. Over time, the Carnegie Unit became the building block of modern American education, serving as the foundation for everything from daily school schedules to graduation requirements, faculty workloads, and eligibility for federal financial aid.
This notion of seat time raises meaningful questions in the 21st century. Specifically, how does the Carnegie Unit align with how students actually learn, and with asynchronous teaching?
When the Carnegie Unit was established, asynchronous learning did not exist. So, what now?
Educational policy makers need to consider how to both honor student achievement and faculty labor outside of the constraints of the clock.
How do we do this effectively in order to meet the needs of non-traditional college students?
See also [Many Paths To & For Personalized Learning]