Imagine, if you will, a classroom of students working. Some are working in pairs, some in small groups, some individually. Each student has an interactive notebook at hand and a laptop in front of them. Conversations are muted. Occasionally someone giggles or laughs. The teacher quietly moves through the room, checking on students and their work, redirecting her path when she sees a raised hand.

After some time, a bell sounds and students reconfigure themselves, some moving from the floor to a desk or chair, but all shifting their attention from their work to the teacher, who now stands at the front of the room. She glances around the room. "Who's first?" she asks. Hands pop up and students begin to share concise summaries of something they've just learned and something they hope to pursue next.

Soon they will begin work on a new concept or a new project. There may be a time of whole group learning or students may reconfigure yet again to work in smaller groups. Students check the board or the dashboard on their computers, or even ask the teacher to confirm what they might do next because, in some cases, they may have choices.

Throughout the day the teacher meets with students to review and discuss what they have learned, what they have yet to learn, and how they will approach that learning. Where needed, she redirects them or recommends specific tools, and she spends time coaching one or more students.

I've no doubt this kind of classroom currently exists somewhere in the U.S. I've no doubt this kind of classroom will become more prevalent throughout the U.S. as educators and parents become more comfortable with and confident in competency-based learning and adaptive learning.

HOWEVER, that will happen only if educators and students retain some control over what students learn and how they learn it.

Two things have conspired to get me to think more specifically about the future of competency-based learning and adaptive learning. The first was an article featuring Candace Thille, a pioneer of big-data analytical thinking and a driving force behind adaptive learning.

She still believes that adaptive learning will become an increasingly important tool in teaching. But she fears that rapid commercialization is exactly the wrong way to foster innovation at this early stage. What's more, she thinks professors and higher-education leaders are making a dangerous mistake by letting companies take the lead in shaping the learning-analytics market.

This is, in some ways, no different from K-12 education relinquishing content and curriculum control to various digital publishers. But the big question is this: can we trust the algorithms to really know what students need and when they need it?

That brings me to the second thing. Yes, AI. Joaquin Phoenix starred in the quirky film Her (2013), in which he develops a sort of relationship with an intelligent OS. What was even quirkier was that some of his friends were also forming relationships with OSes, but soon the OSes dump their human companions and go off on their own. Just recently I read a book about a reclusive genius who created an AI being and used her to set up couples as perfect matches. But soon the AI being got jealous because the reclusive genius wanted actual human companionship. It didn't end well for the reclusive genius, but the AI being was left searching the networks for him because she didn't understand that his programming had come to a very final end.

In spite of the spate of articles about driverless cars, computers that have mastered the ancient game of Go, and other AI-related developments, I'm still hopeful that humanity has a little something that isn't replicable. In fact, I think of Data in Star Trek: The Next Generation and how he seemed to yearn to be more human, to feel emotions. Then again, I can also think of a few instances when it's been clear that logic alone is not the best way to make some decisions.

That being said, I applaud Ms. Thille and her concern that educators, and educational institutions, may believe the hype that an algorithm knows best. I don't see how. Especially in higher education. According to the National Center for Education Statistics (NCES), 80% of college students change their major at least once and, on average, students change their majors at least three times. Many of us know students who have reached their junior or senior year only to realize they really have to change their major: 1) so they can graduate, 2) because they hate what they're majoring in, or 3) because they've discovered something else in which they're really interested.

Ahh. That lovely moment of discovery of one's passion. That shiver of recognition that this, this, is what one has been called to do. What algorithm might find that for anyone? Granted, an algorithm might help someone begin to move in the right direction, and that's the beauty of AI and adaptive learning. It provides, I think, I hope, another level of coaching and insight. It can help us winnow out the intellectual chaff.

The Chan Zuckerberg Initiative is all about personalized learning, and with good reason. Whether we like it or not, whether we believe in it or not, personalized learning is becoming a very real thing and a very real expectation in our education adventure.

I believe that shift will reinforce the fact that there is not and cannot be a one-size-fits-all approach to anything: teacher training, teacher professional development, student learning, student resources, and, perhaps most importantly, those infernal standardized tests. Essentially, we have to rethink and reimagine anything and everything that happens in a learning space that may or may not be an actual classroom.

That doesn't mean whole-class instruction will evaporate, but it does mean that administrators, unions, boards of education, and others will have to relinquish long-held beliefs and perceptions about what it means to get an education and to be educated.