
Digital Video and Internet2: Growing Up Together

The simultaneous development of video conferencing technology and Internet2 has enormous implications for education, especially distance learning. Using synchronized streaming media, educators may soon send interactive multimedia presentations to students anywhere in the world, in real time.

After years of hype, talk, and development, it looks like Internet2 has finally reached, well, puberty. Imagine being able to fire up your PC and log on to an astronomical observatory in Hawaii from your Vermont classroom to check out the constellation Orion in real time. You almost can. Already, astronomers around the globe can log on and scope the heavens via the Mauna Kea Observatories Communication Network in Hilo, Hawaii, an I2 connection that links eleven of the world's leading astronomical observatories. Researchers are able to control and view the skies remotely with a powerful telescope that transmits live digital video over a high-speed I2 connection, then share their findings through video conferencing technology. While Internet2 and digital video have not yet come of age, the simultaneous development of both technologies is extremely important. For developers and users, an I2 connection means access to streaming video and synchronized streaming media, video-on-demand, video conferencing, tele-immersion, and even interactive simulation. The bigger pipe (with 45,000 times more bandwidth than the current Internet) that I2 provides makes it possible to create rich video content and guarantees its availability and playback quality to users.

While the creation of the Internet has been likened to having reinvented the printing press, Internet2 can be thought of as reinventing the visual experience, using television as its base model, with digital video rather than text being the main form of content. When the existing Internet was developed, the design goal was to allow for asynchronous data exchange across a large distributed network. By contrast, video broadcast and video conferencing require synchronous communication. Initially, audio and video conferencing were accomplished with private, expensive, dedicated lines. With IPv6 (the Internet protocol of I2), some of the benefits of those private networks, such as the ability to allocate and guarantee bandwidth, also known as Quality of Service (QoS), are now available to I2 participants and will eventually be available to everyone. What makes these features so important is that, as users and content creators, we have access to the best of both worlds: flexible synchronous and asynchronous exchanges.

So, what can you do with digital video? The range is huge, but there are basically two standard modes of DV: broadcast and conferencing.

Broadcast Digital Video

Broadcast DV encompasses both live broadcast and video-on-demand. Live broadcast works similarly to current television news broadcasting. Content is captured live and transmitted to a user who is watching passively from a box located potentially anywhere. With video-on-demand, the content has been created and uploaded so that the user can choose when he or she is going to view it. While similar, the methods for creating a live broadcast and a video-on-demand program differ somewhat. Both require the same equipment: a DV camera, a computer with plenty of storage, a high-speed connection, and an end user. Live broadcast content is captured on-camera and dumped onto a machine simultaneously (the live feed can also be recorded while being broadcast), then encoded into a readable format (MPEG-2, Real, QuickTime, ASF), then transmitted over a network to be viewed by an end user. With video-on-demand, the content is recorded, then captured (dumped onto a computer, usually into a video editing program like Adobe Premiere, Apple's iMovie, or Avid Cinema), encoded, then transmitted over a network to be played back by an end user.
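To make the ordering concrete, here is a rough Python sketch of the two pipelines; it stands in for no particular vendor's tool. In the live case, each chunk is encoded and transmitted as it is captured; in the on-demand case, the whole program is captured and encoded up front, then transmitted whenever a viewer asks for it. The camera, encoder, and network calls are hypothetical placeholders.

```python
# Toy model of the two broadcast pipelines. The camera, encoder, and
# network are simulated; only the ordering of the steps is the point.
import time

def camera_frames(n=5):
    """Hypothetical camera: yields one raw 'frame' at a time."""
    for i in range(n):
        time.sleep(0.1)                  # stand-in for capture time
        yield f"raw-frame-{i}"

def encode(frame):
    """Stand-in for a real encoder (MPEG-2, Real, QuickTime, ASF...)."""
    return f"encoded({frame})"

def transmit(packet):
    """Stand-in for sending the packet over the network."""
    print("transmit:", packet)

def live_broadcast():
    # Live: capture, encode, and transmit happen together, frame by frame.
    for frame in camera_frames():
        transmit(encode(frame))

def video_on_demand():
    # On demand: everything is captured and encoded first, stored,
    # and only transmitted later, when a viewer requests playback.
    library = [encode(frame) for frame in camera_frames()]
    for packet in library:
        transmit(packet)

if __name__ == "__main__":
    live_broadcast()
    video_on_demand()
```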

Within the two types of broadcasting are two ways to receive the media: streaming and non-streaming. According to a January 2000 New York Times article, streaming technology was touted as the single most important Internet technology of the coming year. Streaming media, which can include audio, video, animation, and scrolling text, begins playback immediately, downloading as it plays. All live broadcast media is streaming. Going far beyond a single media stream, SMIL (Synchronized Multimedia Integration Language), a W3C standard implemented in RealNetworks' products, allows for simultaneous streams of several pieces of media. I2's impact on SMIL is immense. Professors will be able to create projects that stream several clips of video, a background soundtrack, voice-over narration, animation, text, and even forms, all at the same time, using very small chunks of bandwidth. By comparison, non-streaming media files download completely before they begin to play and are usually stashed on the user's hard drive.
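To give a feel for what a SMIL presentation actually contains, the sketch below uses Python's standard xml.etree.ElementTree module to generate a minimal SMIL 1.0 file that plays a video clip, a narration track, and a text caption in parallel. The file names, region sizes, and layout are invented placeholders; a real presentation would point at clips hosted on a streaming server.

```python
# Minimal sketch: build a SMIL 1.0 presentation that streams a video clip,
# a narration track, and a caption region at the same time (<par>).
# All file names and region dimensions are hypothetical placeholders.
import xml.etree.ElementTree as ET

smil = ET.Element("smil")

head = ET.SubElement(smil, "head")
layout = ET.SubElement(head, "layout")
ET.SubElement(layout, "root-layout", width="320", height="300")
ET.SubElement(layout, "region", id="video", left="0", top="0",
              width="320", height="240")
ET.SubElement(layout, "region", id="captions", left="0", top="240",
              width="320", height="60")

body = ET.SubElement(smil, "body")
par = ET.SubElement(body, "par")          # children of <par> play in parallel
ET.SubElement(par, "video", src="lecture.rm", region="video")
ET.SubElement(par, "audio", src="narration.rm")
ET.SubElement(par, "text", src="captions.rt", region="captions")

ET.ElementTree(smil).write("lecture.smil")
print(open("lecture.smil").read())
```

A SMIL-aware player such as RealPlayer opens the resulting .smil file and pulls each stream separately, which is what lets a complex presentation travel in small chunks of bandwidth rather than as one monolithic video file.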

Digital Video Conferencing

While the broadcast mode assumes the end user to be a passive viewer, the conferencing mode, which includes video conferencing, tele-immersion, and interactive simulation, expects that the user will be a participant. Conferencing requires the same equipment as broadcasting, but video conferencing also requires special software that allows for synchronous encoding and transmission, essentially so that parties on either end of the line can view each other live. The University of Southern California Center for Scholarly Technology recently explored the use of video conferencing in the classroom. The plan was to hold USC professor Greg Hise's Los Angeles and Chicago: Comparative Urban Analysis class simultaneously with Professor Robert Bruegmann's sister class at the University of Illinois in Chicago. Both schools had fast I2 pipes, but the video conferencing software used was inadequately designed and unable to take advantage of the available bandwidth. In the end, the class scaled back on its video conferencing sessions and opted for more traditional lectures.
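The heart of that requirement, sending and receiving at the same time on both ends, can be modeled in a few lines. The toy Python program below runs two "participants" on one machine, each transmitting numbered stand-in frames over UDP while listening for the other's. Real conferencing software adds cameras, codecs, and far more robust networking; the port numbers and frame contents here are invented for illustration.

```python
# Toy model of two-way (synchronous) video conferencing: each participant
# sends its own "frames" while simultaneously receiving the other's.
# Ports and frame contents are invented; real systems use cameras and codecs.
import socket
import threading

def make_socket(port):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    sock.settimeout(2.0)
    return sock

def send_frames(sock, name, peer_port, frames=5):
    for i in range(frames):
        sock.sendto(f"{name} frame {i}".encode(), ("127.0.0.1", peer_port))

def receive_frames(sock, name, frames=5):
    for _ in range(frames):
        data, _addr = sock.recvfrom(1024)
        print(f"{name} received: {data.decode()}")

# Bind both ends first so no frames are sent into the void.
alice, bob = make_socket(50007), make_socket(50008)

threads = [
    threading.Thread(target=send_frames,    args=(alice, "Alice", 50008)),
    threading.Thread(target=receive_frames, args=(alice, "Alice")),
    threading.Thread(target=send_frames,    args=(bob, "Bob", 50007)),
    threading.Thread(target=receive_frames, args=(bob, "Bob")),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```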

"Unfortunately in this case, the pipe was there but the software couldn't handle it," says Rick Lacy, one of the participants in the Greg Hise project. "Many video conferencing products were created assuming that everyone has a slow connection, and the tools just don't adapt. Until the software catches up, we're kind of floating around waiting. Or we have to commit to spending the big bucks." In most cases, those "big bucks" include setting up a television studio-like facility, which most schools don't have the space, means, or staff to create.

Tele-immersion and interactive simulation, which allow a user to actually interact with what he or she is viewing, are already being used in fields like medicine and science. Doctors at a hospital can diagnose patients at a remote clinic site or even at a patient's home. This means that treatment can begin immediately, without the patients ever having to leave their location. When trying to define tele-immersion, think of virtual reality plus full-motion video plus animation plus user interaction. The goal of tele-immersion is for users to be able to interact not only with other users, but also with computer models and environments; picture a Star Trek Holodeck networked between two or more locations.

UCAID (the University Corporation for Advanced Internet Development), the developers behind I2, and the federal leg, Next Generation Internet, are dedicated to creating a virtual environment in which digital video is the highest priority. Internet2 Digital Video is developing an open-source architecture in conjunction with the IETF (Internet Engineering Task Force) and the IEEE (Institute of Electrical and Electronics Engineers) to create "edge technologies" (easy-to-use, reliable technology for the end user) based on the philosophy that the "complexity should not be at the edge of the network but at the core of the network," according to the UCAID Internet2 Digital Video Web site. That core technology will be built on national information infrastructures like Abilene and the vBNS (the research networks that have been developed to host I2). To proceed effectively toward that goal, UCAID has announced these service initiatives:

  • I2 Digital Video Portal, a prototype in which users will be able to search for content within a vast library of videos and within the videos themselves
  • Digital Video Network, started in 1998 to provide high-quality digital video services that will include animation, virtual reality, simulation, and images with audio soundtracks and more
  • Digital Video Conferencing, dedicated to creating QoS on both ends of video conferencing transmissions
  • Digital Video Live Transmission, dedicated to reinventing traditional broadcasting and offering live transmission service over I2 and the current Internet.

These efforts coincide with storage, archival, and retrieval initiatives that UCAID is also pursuing. At this point, there are no permanent, widespread solutions for archiving. Bandwidth alone is not enough: for content delivery to succeed, there must be a stable way to store and retrieve that content.
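None of these archival systems is publicly specified yet, but the shape of the problem, pairing stored video with searchable metadata, is easy to sketch. The Python example below models a tiny video catalog with keyword search over titles, descriptions, and tags; the field names and sample records are invented for illustration.

```python
# Minimal sketch of a searchable video archive: each stored clip carries
# metadata, and a query matches against that metadata. All records and
# field names here are invented placeholders.
from dataclasses import dataclass, field

@dataclass
class VideoRecord:
    title: str
    description: str
    location: str                       # where the encoded file lives
    keywords: list = field(default_factory=list)

class VideoArchive:
    def __init__(self):
        self._records = []

    def store(self, record: VideoRecord):
        self._records.append(record)

    def search(self, term: str):
        term = term.lower()
        return [r for r in self._records
                if term in r.title.lower()
                or term in r.description.lower()
                or any(term in k.lower() for k in r.keywords)]

archive = VideoArchive()
archive.store(VideoRecord("Orion in Real Time",
                          "Live telescope feed from Mauna Kea",
                          "rtsp://example.edu/orion.rm",
                          ["astronomy", "telescope"]))
archive.store(VideoRecord("Comparative Urban Analysis, Lecture 1",
                          "Los Angeles and Chicago seminar session",
                          "rtsp://example.edu/urban01.rm",
                          ["urban studies", "lecture"]))

for hit in archive.search("telescope"):
    print(hit.title, "->", hit.location)
```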

Research Channel

One of the most successful programs to experiment with Internet2 thus far is the University of Washington-based Research Channel, which broadcasts original DV content twenty-four hours a day, seven days a week. Content comes from universities and member organizations all over the country and includes lectures, documentaries, seminar series, and interview shows.

Not wanting to limit viewership, the Research Channel offers three ways to access content: a 28.8K modem connection; a cable, DSL, or T1 line; or an I2 connection.

"WhileResearch Channel has been a pioneer in high-bandwidthdigitalvideo distribution,andthat continues to be our goal, we will also continue to use traditional methods of broadcast, like satellite, so that we don't exclude anyone from seeing content," says the co-founder of Research Channel, Amy Philipson. "We want to offer several options in content quality and delivery. Eventually, we would like to see image, sound, and production quality that's at least as good as existing TV, or better than TV (HDTV)."

Another goal for Research Channel is to bring the content to viewers in other ways, such as the video-on-demand method. "The hope is to expand the palette of options for broadcasting the material. There is a lot of convergence of technology happening, with things like digital networks over cable, set-top boxes, in addition to networked computers. It's an exciting time. We are really at a crossroads," states Philipson. According to Philipson, Research Channel gets about 20,000 hits a month from all over the world, often followed by e-mail from supportive viewers locally and internationally.

Those ratings are great news for the folks at ViDe (Video Development Initiative), whose goals are to promote the use and development of digital video in higher education. ViDe assists colleges and universities by helping them set up the infrastructures necessary to create and publish DV, and by encouraging schools to collaborate with one another. ViDe was founded by the Georgia Institute of Technology, North Carolina State University, the University of North Carolina at Chapel Hill, and the University of Tennessee, Knoxville, and now includes nine additional members, including the University of Washington.

What if you're not among that elite group of 120+ colleges and universities that make up the testbed for I2? When are you going to get your hands on all this cool stuff? Well, it's closer than you think. The main reason I2 hasn't reached the masses yet is simple: cash. Bandwidth has become a commodity. But cost is dropping for both network providers and users as prices for hardware and software decrease. While Internet2 will probably remain an "academics-only" network, you can be sure that once commercial providers figure out a cost-effective way to create and distribute access, a commercial I2-like network (probably several of them) will be available. These networks are bound to follow a route similar to that of the first-generation Internet: once the general public and the business world have access to them, endless creative (and a good bit of mindless) development will boom.

In the meantime, you can start playing around with digital video and try your hand at broadcasting over the existing Internet. Digital cameras have dropped to less than $1,000 for quality devices, and inexpensive editing software like Apple's iMovie (free with an iMac) and Avid Cinema (around $100) offers serious bite for not much bark. Along with less-expensive versions of once out-of-reach tools, the tools themselves are much easier to pilot. What once required high-dollar, cumbersome cameras and expensive, expert-level editing facilities is now almost as simple to create as PowerPoint slides. And consider this: if you start learning now, then once I2 is unveiled at her first debutante ball, you'll already know how to dance.


What You Need to Create Your Own Digital Video Broadcast

  • Record - Digital video camera with a FireWire port
  • Capture - iMac running iMovie to capture and edit the video you filmed
  • Encode - You can then use either RealNetworks' encoder or Apple's QuickTime Pro
  • Transmit - Stream your content with RealNetworks' server, either across your campus network or over I2 (a simple test setup is sketched below)
  • Playback - PC, Mac, or UNIX workstation running the QuickTime player or RealPlayer
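If you don't have a streaming server handy, one quick way to test the transmit and playback steps is plain progressive download over HTTP. The sketch below uses Python's built-in http.server module to serve encoded clips from the current directory so a player elsewhere on the campus network can open them by URL. It is a stand-in for a real streaming server, not a replacement for one, and the port number is arbitrary.

```python
# Minimal stand-in for the "Transmit" step: serve encoded video files from
# the current directory over HTTP so a player on the campus network can
# fetch them by URL (progressive download, not true streaming).
# The port number is arbitrary.
from http.server import HTTPServer, SimpleHTTPRequestHandler

PORT = 8080

if __name__ == "__main__":
    server = HTTPServer(("0.0.0.0", PORT), SimpleHTTPRequestHandler)
    print(f"Serving encoded clips at http://<your-machine>:{PORT}/")
    print("Open, for example, http://<your-machine>:8080/lecture.mov in your player.")
    server.serve_forever()
```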