Should Educators Be Concerned About Web 3.0?
Even Web 2.0 is a confusing mass of capabilities, yet already people are talking about Web 3.0. Where are we in all of this? What's important for educators to know?
Technology definitions of Web 2.0 may not convey a clear picture of what '2.0' means for educators. Tim Berners-Lee, who invented the Web, says there is no such thing as Web 2.0 because its capabilities were anticipated in Web 1.0; by that reasoning, what we're referring to as Web 2.0 should simply be called 'the Web.' But, in fact, from a cultural perspective, Web 2.0 has been a major event. In this century the original Web matured, and many new technologies became more widely and easily available, with far more accessible user interfaces.
The social Web is very, very different for users in higher education than the display Web of the 1990s. Here are some examples:
1. Almost anyone can now build a Web page, personalize its design, and add content. Some updating features can even be automated, reducing the work of keeping a page looking current.
2. Almost anyone can build a social site, which is usually just a template-based Web page with functions like a blog, groups, broadcasts, importing from other social sites, member profiles, and dozens more.
3. Facebook has created a schmooze space not just for the general public but, interestingly, for academics. I can read that one academic friend is amazed by snowfall in Seattle, of all places, and another academic friend is stuck in Newark Airport.
4. Immersive environments like Second Life and Wonderland allow us to re-clothe our virtual selves and lead a, well, second life.
5. E-mail accounts are available almost anywhere for free, and you can keep your e-mail on the Web, where it is available from any computer.
6. We can collaborate on documents at Google Docs so that all collaborators can keep up with changes no matter where they are.
7. I can Skype with friends in Australia on New Year's Eve (New Year's Day for them) and see them in T-shirts, sweating on a deck in the middle of their summer vacation. (A bit disconcerting.)
8. We expect that no matter where we go, we can find a Wi-Fi spot or at least use our iPhone to stay in touch.
9. We know we can see a video of our granddaughter on YouTube just seconds after it was taken.
We have all become Webizens. Just a year ago in this newsletter I was writing about Web 2.0 with the sense that many readers would have no idea what it was. But now defining it culturally, as I am doing, is like describing coffee shops or the movies. We have been enculturated, or, better said, we have enculturated Web 2.0 technologies in just the past one or two years. Web 2.0 was the cultural tipping point for acceptance of the Web.
Now that we find ourselves in this strange, undiscovered land, what do we do? How can academics put these wonderful new technologies to good use? Most significant for academia is the move toward the Semantic Web: a Web explored not by chance character-string matches but by the meaning behind our search terms. Browsers will look for pre-organized or tagged information that fits the meaning we appear to be looking for.
The pre-organized option depends on our creating ontologies, structured collections of information, ahead of time and then maintaining them over time. The tagged option means we use technology to normalize tags so that people can select consistent tags for new information being added to the Web. A tag is considered good because the wisdom of the crowd says it is: in practice, when you go to add a tag, you see the tagging options that other people have already used. Good tagging can also lead to successful semantic searches. My guess is that tagging, not ontologies, is already becoming the default choice. Ontologies take a lot of work, naturally, and perhaps only large, well-staffed organizations will maintain them.
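For the technically curious, the crowd-normalized tagging described above can be sketched in a few lines of code. This is only an illustration of the idea, not the implementation of any real tagging service; all the names here (TagStore, add, suggest) are invented for the example.

```python
# A minimal sketch of crowd-based tag normalization:
# tags are normalized on entry, and new taggers are shown
# the most popular existing tags matching what they typed.
from collections import Counter

class TagStore:
    """Tracks how often each tag has been used across the community."""

    def __init__(self):
        self.counts = Counter()

    def add(self, tag):
        # Normalize before storing: trim whitespace, lowercase.
        self.counts[tag.strip().lower()] += 1

    def suggest(self, prefix, limit=3):
        """Offer the most-used existing tags that match what the user
        has typed, so new content converges on the crowd's vocabulary."""
        prefix = prefix.strip().lower()
        matches = [(t, n) for t, n in self.counts.items() if t.startswith(prefix)]
        matches.sort(key=lambda pair: -pair[1])  # most popular first
        return [t for t, _ in matches[:limit]]

store = TagStore()
for t in ["Web2.0", "web2.0", "web3.0", "webizen", "education"]:
    store.add(t)

print(store.suggest("web"))  # the crowd's favorite "web..." tags
```

Notice that "Web2.0" and "web2.0" collapse into one tag, which is exactly the normalization that makes crowd wisdom accumulate instead of fragment.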
Students should learn not just how to search for material "out there" (by developing good research skills) but also how to pre-classify their own material before it goes out there--so it can be added to the culture's store of knowledge. These are two distinct skills, and they are the survival tools of Web 3.0--and of 2.0 as well, since tagging is already an option. Having students tag their own work is a fascinating way for them to reflect on the core meaning of what they have just created. But in most cases this is a skill that needs to be taught.
Web 3.0 is the scholarly Web just as 2.0 is the social Web. And, in my view, what has been called 3.0 is really a refinement of 2.0 and so should be called Web 2.1. Higher education was blind-sided by 2.0, but it can really take the lead on 2.1.