Surprise! If you’re reading this, the end of the world—predicted to occur on Dec. 21, 2012, as the 5,125-year-long count from the Mayan calendar came to an end—did not take place.
Unfortunately, that means you’re going to have to live through the introduction of Windows 8, which may prove equally apocalyptic. Remember Windows XP? It was a reliable workhorse, and most businesses refused to give it up, no matter how hard Microsoft pushed its successor, Windows Vista. Most Vista machines were sold to unsuspecting consumers: When you buy your new HP, Toshiba or Dell computer at Best Buy, Costco or some other retailer, you don’t get much choice in the matter. And that’s the way it will be with Windows 8.
Here’s an opinion from Philip Greenspun: “Suppose you’re an expert user of Windows NT/XP/Vista/7, an expert user of an iPad and an expert user of an Android phone…you will have no idea how to use Windows 8.” Usability expert Jakob Nielsen goes further with a scathing review titled “Windows 8: Disappointing Usability for Both Novice and Power Users.”
As always, there are different flavors of Windows 8: Professional and Enterprise, intended for businesses, and the non-Pro version, intended for home users. And, as usual, home users take it on the chin: You can “downgrade” a copy of Windows 8 Professional to Windows 7, but not the basic version aimed at consumers. If you don’t have a choice in the matter, you may be interested in RetroUI, a $5 product that lets you restore the feel of Windows 7 to your Windows 8 system.
It’s tough for Microsoft, since it relies on shipping new versions of Windows and Office to fuel sales. In theory, a new version should bring improvements that users want. In practice, changing things for the sake of change imposes a huge cost on businesses (and everyone else), which, at best, must learn a new way of doing things. At worst, the changes actually are worse than what they replace.
It’s interesting to note that Steve Sinofsky, the head of the Microsoft division responsible for Windows 8, left the company not long after its release. And the uptake of Windows 8 products has thus far been disappointing. My New Year’s recommendation is to avoid Windows 8 for the time being.
MOOCs
One of the things you’re sure to read more about in 2013 is Massive Open Online Courses, or MOOCs (pronounced “mooks”). MOOCs first gained media attention in fall 2011 when 160,000 people signed up for a free course titled “Introduction to Artificial Intelligence,” taught by Peter Norvig (director of research at Google) and Sebastian Thrun (leader of Google’s self-driving car team), using materials from their introductory course on artificial intelligence (AI) at Stanford University:
“The objective of this class is to teach you modern AI. You learn about the basic techniques and tricks of the trade, at the same level we teach our Stanford students. We also aspire to excite you about the field of AI. Whether you’re a seasoned professional, a college student or a curious high school student, everyone can participate.”
The course offered two tracks: a basic track that didn’t require homework or exams, and an advanced track with weekly homework, a midterm and a final exam. Participants who successfully completed the course received a certificate of completion (not issued by Stanford, however). Advanced track participants also received a completion percentage based 30 percent on homework, 30 percent on the midterm and 40 percent on the final, as well as their percentile ranking relative to all the other advanced track students. You can see all the details (including the lectures and assignments) at www.ai-class.com.
Since then, a lot has happened in the world of MOOCs: MIT and Harvard have founded edX, a nonprofit that offered seven courses in fall 2012 to a worldwide audience at no charge. Coursera, a commercial startup with strong ties to Stanford, is currently offering 208 courses in partnership with 33 universities, including Caltech, Princeton, UCSF and Columbia. After the success of “Introduction to Artificial Intelligence,” Sebastian Thrun co-founded Udacity, which has developed a smaller number of courses focused on computer science, math and physics. The difference is that Coursera is adding (limited) interactivity to existing video-based lecture content, while Udacity is creating courses specifically for online presentation. This explains both the type and quantity of courses available from each company.
It’s great to see university-level courses being made available at no charge to anyone who’s interested (and has online access). But it raises the question: If you took all the courses online for free that a Harvard or Stanford undergraduate takes on campus (and pays for dearly), do you have an equivalent education? What if, rather than showing online students their percentile rank among other online students, courses showed a student’s percentile rank among all students taking the course? What if some of the top students were online students? And if you can be taught by the best teachers in the world, regardless of where they are, what does that mean for mediocre teachers and university programs? MOOCs represent the disruptive power of the Internet for higher education.
As employers, we use a college degree as a proxy for the intelligence and dedication of a potential employee. Certainly, I’ve benefited from my Harvard degree, even though I received some grades I’m not proud of (I did graduate cum laude, in case you’re wondering). Is an applicant without a degree, who took the same courses and ranked the same as a student with a degree, equally qualified/intelligent/dedicated? Are they worth the same pay? If not, why? Your answers depend a lot on how you view employees and education. One thing is certain: These questions are going to become a lot more important during the next decade.
Author
Michael E. Duffy is a 70-year-old senior software engineer for Electronic Arts. He lives in Sonoma County and has been writing about technology and business for NorthBay biz since 2001.