A Learning Revolution in Texas: Bringing the World to the Classroom

On the outskirts of Austin, Texas, a small high school of relatively modest means is fundamentally changing how education reaches its students -- with video technology that lets them debate their peers in the most remote areas of this country, or even around the world in places like Iran, Taiwan, India and Bosnia.

School districts often fail to see the significance of technology in education, according to Michael Cunningham, the principal of Del Valle High School. He is embracing technology as the future of learning because it offers low-cost alternatives to increasingly expensive conventional approaches to education.

With video conferencing, Del Valle High School (part of the Del Valle Independent School District) now participates in international educational programs with 225 schools across 75 countries.

“What the PDF has done for books, video conferencing can do for teaching,” Cunningham explained. “Right now, a printed textbook may cost $100 or $200 per copy, whereas a PDF version of the same book can be available for almost nothing.” With video conferencing, he continued, the best lecturers from around the world can be viewed in a classroom for a fraction of the cost of hosting them in person.

World-Class Experience Made Affordable

Cunningham’s quest to provide a world-class school experience to his students has been an abiding passion for over a decade. “Del Valle is essentially a lower socio-economic school district, and students don’t have many of the advantages available to their counterparts in other schools,” Cunningham said. “In the 2001 school year, we learned our school district had underutilized video equipment. We put it to work very quickly, first in debates with schools from Alaska and Canada. That has since morphed into all kinds of other video conferencing applications.”

Just this past December, Del Valle students held a video debate with students at Kherad High School in Iran. Del Valle has been conducting debates with Iranian students for the past four years -- at a time when the US government was not even holding formal discussions with the nation.

In a “what if” debate, the students considered what might have happened in world history if Cyrus the Great of Persia had conquered the Greek city-states. “It yielded some interesting discussion,” Cunningham said. “Arguably, there might not have been Christianity or the Crusades -- even the Muslim religion might not have come to be. Our whole way of government might have been different. Just one or two changes in world history could have resulted in a wholesale change in the way we understand the world.”

In most cases, the video conferencing tool used by Cunningham and his fellow educators around the world is LifeSize ClearSea, an open-standards, software-based system that requires no dedicated equipment. Proprietary systems from other providers would make the process considerably more complicated. The technology enables high-quality video communication over very low bandwidth, so even students in countries with limited Internet resources can participate.

For Cunningham, video technology expands educational opportunities for students who otherwise might not have access to them.

“In Bosnia-Herzegovina, local students walked to the nearby American consulate to take part in an online video debate with our students,” Cunningham said. “Most of our partner schools around the world now use this technology to speak with us. Without it, this would be little more than a one-sided conversation.”

Later in 2014, Cunningham is planning a moot-court trial of the Warren Commission, whose report will mark its 50th anniversary in September. He is also interested in planning an event to commemorate the 100th anniversary of the Armenian genocide.

Redesigning Education

Underlying Cunningham’s use of video technology is his belief that current approaches to high school education are relics of the past.

“The traditional high school came about in the 1800s. There needs to be a complete redesign of the curricula in our schools,” said Cunningham. Video conferencing is key to his reimagining of the education process.

“A lot of schools have video conferencing capacity, but not a lot of schools utilize it,” he said. “By redesigning the curricula, we could begin to use almost our entire capacity for educating students. We could bring in tutors from around the globe, with specialists from specific areas of the world providing education on events of historical significance in that area. It completely changes the landscape for learning.”

For Cunningham, it’s a very compelling model for delivering a high-quality educational experience while dramatically reducing costs, using technology that in many cases is already in place in schools. “I can show you how, as an educator, you can stay in your building and teach people off campus, right now, for no money. I can show you how you can tutor around the clock for a fraction of the cost. I can show you how to offer new electives without having to hire additional teachers.”

The difficulty, Cunningham acknowledges, is overcoming the entrenched skepticism that technology more than a decade old can actually change education fundamentally for the better.

“At the leadership level in many schools, there is an awareness problem about what video conferencing can actually accomplish,” Cunningham said. “Only pockets of teachers are utilizing the technology at this point, and if such a teacher leaves, the entire concept tends to lose momentum.”

Whoever can effectively break down the skepticism and rebuild the educational system with video conferencing as a key component will be “a leader in the field,” according to Cunningham. “We’re talking about billions of dollars in savings that could be redirected into making a vast difference in education.”

“Just as chalkboards have given way to whiteboards, and PDFs have replaced printed textbooks, so will video conferencing replace overhead projection, films, and substitute teachers or guest instructors,” he said. “With video, you can bring a world-class experience to your school for only a little money. We are enabling our students to open their minds and see what is really going on in their world.”

This piece first appeared in Wired Magazine December 2013

The Internet Just Isn’t That Big a Deal Yet: A Hard Look at Solow’s Paradox

The Internet age has given us blisteringly fast connectivity to the World Wide Web, cloud computing, nearly instant collaboration and high-definition, face-to-face video communication with our peers around the world. Yet our rate of economic productivity growth has not only stalled in the past several years but also taken dramatic dips. The promise that the Internet would make everyone’s job easier and boost economic advancement has not been met. Why?

The answer lies in a closer look at Solow’s Paradox. The concept was first described in 1987 by economist and author Robert Solow, who observed, “You can see the computer age everywhere but in the productivity statistics.” As it gained currency, Solow’s Paradox came to be defined as the “discrepancy between measures of investment in information technology and measures of output at the national level.” In particular, it asks why the rate of productivity growth appears to be slowing dramatically in the Internet age.

And the slowdown is undeniably real. According to an early November report from the Bureau of Labor Statistics, business sector labor productivity increased at a 2.0 percent annual rate in the third quarter of 2014, as output increased 4.4 percent and hours worked increased 2.3 percent. From the third quarter of 2013 to the third quarter of 2014, productivity rose 0.9 percent as output and hours worked increased 3.0 percent and 2.1 percent, respectively.
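Those figures hang together arithmetically. Labor productivity is output per hour worked, so its growth rate is roughly the growth rate of output minus the growth rate of hours -- a quick back-of-the-envelope check (my arithmetic, not the BLS’s):

\[
g_{\text{productivity}} \approx g_{\text{output}} - g_{\text{hours}} = 4.4\% - 2.3\% \approx 2.0\%
\]

The year-over-year figures work the same way: 3.0 percent minus 2.1 percent gives the reported 0.9 percent.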

Looking at the year-over-year performance by quarter, that seems like good news. But on closer inspection, productivity growth has actually declined steadily from a high of 8.3 percent in Q2 2009, including sizeable dips of -2.7 percent in Q1 2011 and -4.5 percent in Q1 2014.

An average productivity growth rate of 2.0 percent is something people actually might notice in their lifetimes. With the trend since 2009 showing steady decline, however, what’s really noticeable is that there seems to be no correlation between productivity and technological advancement.

For me, it boils down to the fact that, compared to the technological innovations of the last industrial revolution (electricity, automobiles, wireless broadcasting), the Internet age just isn’t that impressive. The technological advancements of the last century had a truly transformative effect on the previous industrial age. Ice harvesting was replaced by refrigeration, the horse and buggy by the automobile, the local burning of fossil fuels for energy by centralized electrical power production. These advancements were notable not just for what they achieved in themselves but for how they affected society.

What’s more, consider that the average American worker’s productivity soared at an average rate of 2.7 percent from 1939 to 2000. Among other reasons, the productivity surge occurred because the military-industrial complex went from building more weapons to building more expensive, more sophisticated ones. This was a 20th-century idea, and it came into its own with the advent of the Cold War. The USA knew it could never beat the Soviet Union or China by weight of numbers, but it could beat them with more efficient systems. That drove both the escalation of the arms race and the space race.

In defense of today’s technology, despite its ubiquity, some industry observers believe society has really been in the Internet age for no more than 15 years. As a result, today’s incredible advances in technology have simply not had the time to affect society yet. With the ever-quickening rate of change offered by the Internet, it is sensible to assume that 15 years from now we may see some more profound changes, on the order of those enjoyed across society in past periods of innovation.

That is, of course, if we take advantage of the new technological changes available to us. Right now, we are not. People are afraid of rapid change, and when technological change happens faster than the investment cycle for a technology, major problems can arise. People will dig in their heels as a reaction to the newness of innovation. They will stick to old, familiar processes, even when the new ones are faster, easier and more efficient. That’s the equivalent of driving a horse-drawn carriage on the freeway.

To avoid falling victim to Solow’s Paradox over the long term, society as a whole -- and business in particular -- needs to think bigger. No longer can business think in terms of 10 percent improvement. Today’s business leaders need to radically change their business processes and look for ones that are 10 times better. Take advantage of the opportunities offered by the cloud and gigabit Ethernet, and put the applications of the Internet age to work daily, across every aspect of your operations.

That will spur a real revolution in how to do business. And only then might we be able to disprove and put to rest Solow’s Paradox.

This piece first appeared in Wired Magazine November 2014

Trade Show Robots: Defying Convention

I take a rather dim view of the new technology of service robots in the workplace. Recently, though, I’ve come across a new use for the technology that makes a lot more sense: video conference-enabled trade show robots.

Unlike robots in the office, trade show robots could be a boon for the $100 billion-plus global trade show and conference industry. Combined with video conferencing capability, these robots could dramatically change how trade shows and similar events engage attendees.

First, a little background. There is a market boom in so-called “service” robots, as opposed to “industrial” robots like the ones used on assembly lines. The International Federation of Robotics estimated the worldwide market for robot systems in 2012 at $26 billion. Service robots, meant for personal or professional use, saw a 20 percent increase in 2012 sales over 2011. From 2013 to 2016, that segment alone will likely have a market value of $17.1 billion.

How does this play into the trade show world? In October 2013, Suitable Technologies made 50 of their Beam “remote presence” or “telepresence” robots available for rent at the RoboConference in Silicon Valley. In a conversation with a trade editor, Suitable’s founder Scott Hassan suggested he could have 10,000 of his robots at the 2015 Consumer Electronics Show (CES).

It’s not such a far-fetched notion to have robots roam the trade show aisles. Some showgoers are already renting Segways to cover more convention area quickly. Robots would simply remove the human element altogether, replacing it with the so-called robot “avatars” already making a splash on the market.

Robot builders originally suggested a far less practical business model, which was to use the technology in the workplace so traveling workers could have the same kind of access as when they’re actually in the office. I just can’t see the market for very many business robots, zipping around on their little robot wheels, popping their little robot video heads into offices. It’s hard to justify more than a couple of these service robots in even a very large company, and that’s not a growth strategy.

Trade shows, though, now that’s another thing altogether. The economics of using robots make sense there. When CES rolls around, it’s hard to find a hotel room in Las Vegas for less than $400 per night. Add the travel costs, and it becomes prohibitive to send a group of attendees to the show. That cost is even greater for international travelers. Robots would eliminate that expense, or at least reduce it greatly.

And with telepresence robots (that is, robots with video conferencing capability), you’re not limited to one attendee per robot. Today’s video conferencing technology allows for multiple shared calls on the same device. An entire group can attend a conference on a single robot, with eight or nine shared views. Participants can drop in or out at any time, depending on their interest in what’s happening at the time.

All this raises the question of whether exhibitors and show coordinators will have to create robot-friendly environments. It may no longer be practical to have raised platforms in booths and exhibit areas. Printed information like brochures or show guides may gradually be phased out in favor of scannable QR codes that download the information right through the robot.

If robots catch on, we can expect shorter cafeteria lines as more attendees opt for the virtual experience. (On the other hand, it will put a crimp in concession revenues -- and a real dent in the promotional gift market. Expect substantial sales drop-off in stress balls, t-shirts, mints and other promotional items when robots roam the floors.)

Of course, not all show organizers would want to manage 10,000 robots. That opens up an entirely new potential line of business, which I’m dubbing “fleet robotics.” Suitable’s Scott Hassan has the edge on the thinking here. Fleet robot rental companies could provide and manage large numbers of robots for trade show or event companies that don’t want the hassle of owning their own service robots.

It’s not unlike rental car companies that offer fleet services to other businesses. And it opens the robot market to a company like Suitable to supply new channels for their products -- they can sell direct, offer fleet services themselves, or create inventory for an emerging fleet rental market.

Believe me, that market is likely to happen. If we’ve learned anything from Arnold Schwarzenegger, it’s that robots are pretty hard to stop.

This piece first appeared in Wired Magazine January 2014

Why Online Learning is More Valuable Than Traditional College

Distance learning -- especially given the advances of high definition video conferencing -- has long been touted as a way to augment conventional educational techniques. Increasingly, though, it’s worth considering whether it might even be a valuable substitute for a traditional college degree.

Consider the skyrocketing costs of college tuition. For the last 30 years it has been taken for granted that you get a college degree to earn more money throughout your career. Now, however, the student debt problem in America has escalated dramatically. The Federal Reserve Bank of New York reports $902 billion in outstanding nationwide student loan debt, while the Consumer Financial Protection Bureau estimates the figure at $1 trillion.

With the rising cost of university tuition in America, it’s fair to question whether the additional income you might earn with a college degree actually offsets the cost of the loans you need to pay for that degree. Perhaps the answer to the student debt problem is a fuller embrace of online education.

Online colleges have existed for decades, of course. But I’m not talking about those for-profit universities. I’m talking about an atmosphere of free learning from the best available lecturers. The Khan Academy, for example, is a non-profit organization with the goal of providing a free, world-class education to anyone, anywhere. To date, it has delivered over 300 million lessons. Its YouTube channel has more than 283 million total views. By comparison, MIT’s channel has 52 million views and fewer than half of Khan Academy’s 1,233,000 subscribers. That’s the power of video in learning.

The downside of an online education is, in many cases, the absence of an accredited degree. That has been a problem because employers do not typically accept course credits in place of an actual degree.

I believe they absolutely should. For example, you can take practically every MIT course through iTunes U. But because you did not pay tuition, you don’t get a diploma at the end of it. In today’s business world, that is simply not a reasonable basis for rejecting a promising job candidate.

After all, if you take a work-related class while on the job, the certificate of course completion is really not as important as the information you learn in the class. (The certificate still matters when it comes to certification in certain technical specialties, such as Microsoft computer training, but that’s a topic for another column.)

And honestly, once you've been in the work environment for longer than even a year or two, who cares where you got your degree?

On the Internet, where everything is available, you have access to the best, most distinctive material from the world’s top scholars. That’s not true of a typical college -- and that’s where an online education becomes more valuable than a typical college degree. If I’m going to spend the kind of money that colleges require for tuition, I want to make sure I have the best possible lecturers. In an online, open environment, you can find the finest lecturer in any field and absorb that information at your own pace. And as long as you have a connection to the Internet, you can do it for free.

This value is particularly relevant for older workers. Younger people may be willing to carry a five- or six-figure debt load for their degree. But if I’m a 45- or 50-year-old employee, can I really afford to be saddled with a six-figure debt for a degree that will add only incrementally to my earning potential over the rest of my working life? Will I make that money back? Probably not.
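To put rough, hypothetical numbers on that question (mine, for illustration only): suppose the degree takes $100,000 in loans and adds $5,000 a year in salary over the 15 working years such an employee has left.

\[
15 \times \$5{,}000 = \$75{,}000 < \$100{,}000
\]

Even before a dollar of interest on those loans, the degree never pays for itself in that scenario.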

The necessity of having a degree from a traditional college seems increasingly outmoded even as it has become institutionalized. And it flies in the face of the track record of entrepreneurial giants who have risen to prominence without a degree. Steve Jobs dropped out of Reed College, then started auditing classes for free -- the same classes that would have cost him thousands of dollars a year as an enrolled student. He built his education around the fields that interested him and sparked his creativity.

These days, Apple’s career-track executive positions focus overwhelmingly on college students and college graduates. According to the company’s jobs page, “Apple is a place where students and college graduates thrive” -- with little said about opportunities for self-motivated people without a degree. If Steve Jobs were alive and looking for work today, he’d find it difficult to enter the executive ranks at Apple.

The debate here is whether a college degree is truly worth the money. Without a degree, you may not be able to earn as much over the course of your career. On the other hand, you will not be saddled with debt. At some point, it does seem to become economically unfeasible to earn that degree conventionally.

With easy access to top-quality lecturers online -- in some cases even interacting with those lecturers in real time through two-way video connections -- what difference does a degree make any longer, as long as you’re getting the knowledge?

This piece first appeared in Wired Magazine January 2014

Language and the Interconnectedness of Things

The Internet of Things has spawned more than just an increased infiltration of web technology into our day-to-day lives. It has introduced a much more connected experience among the everyday users of that technology -- let’s call it the “Interconnectedness of Things.”

That, in turn, has made it more important than ever that we appreciate the benefits of a common means of communication in science, technology and business. For what it’s worth, that common means of communication is (at least for the foreseeable future) the English language.

Despite the technological advances attributable to China and Russia, English is still the de facto language of science and business. As far back as 2008, Research Trends magazine noted that English is the first language of about 400 million people in 53 countries, and the second language of as many as 1.4 billion more. English, the magazine contended, is “well positioned to become the default language of science.”

As for business, a 2012 Reuters survey conducted by Ipsos Global Public Affairs showed that, among employees of 26 nationalities who deal with people in other countries, more than two-thirds use English most often.

"The most revealing aspect of this survey is how English has emerged as the default language for business around the world," said Darrell Bricker, CEO of Ipsos.

What does that mean in today’s interconnected age? Common communication skills create more options for more people, and yield better economic prospects for those who have them. A common means of communication enables assimilation and creates positive changes in the culture of science and business. Distance is no longer an issue when the Interconnectedness of Things allows us to employ that common language to take full advantage of the Internet.

Breaking society into smaller pockets with no common means of communication is disadvantageous to progress in science, technology and business. Think of how much poorer your Internet experience might be if you could read only Russian or Chinese. It would be a more skewed experience with a much more limited point of view. That has already been proven to some extent by Chinese censorship of the Internet. You simply can’t fully appreciate the Internet without English.

On the other hand, resorting to a common language is also possibly damaging to the cultural identity and sense of heritage for non-native English speakers. According to the MIT Indigenous Language Initiative, approximately 6,000 languages are spoken around the world. Of those, they say, only about 600 are “confidently expected to survive this century.”

That is a real tragedy, but the plain fact is that affluence is tied to common language. If it were not, this trend toward English as the default language for science and business would just not be happening.

So, what’s to be done about language, culture and progress? In an ideal world, we’d all learn to use one language for science, technology and business, and learn, respect and use others for cultural identity and a sense of community -- especially in our polyglot nation.

That requires some flexibility in how languages themselves are developed. We need to be more adaptable and sensitive to other cultures as we use language.

Some languages, however, seem institutionally disposed toward inflexibility. For example, l’Académie française protects the French language, allowing only a few new words each year to enter the lexicon. A commission of the academy’s members (known somewhat supernaturally as “the immortals”) regularly publishes a dictionary of the French language, considered to be the “official” usage guide.

That’s different from the way in which the Oxford English Dictionary, for example, provides its updates. Slang, new interpretations of established words, and even new concepts seem to be embraced rather than limited.

In my opinion, an overly academic approach to language has stunted the growth and evolution of French as the lingua franca (irony intended) of business and science. Not that French is doing all that poorly -- it is an official language of many international organizations, including the United Nations, the EU and NATO. And in 2011, Bloomberg Businessweek ranked French as one of the top three most useful languages for business, behind English and Chinese.

But it is this rigid approach to monitoring the language that keeps French in third place, despite France’s remarkable history of scientific advancement. To some extent, French speakers are taking note. The official Académie dictionary is increasingly disregarded by users in favor of language that has naturally fallen into common usage.

That’s for the best. The more flexibility there is in allowing a language to change and evolve, the richer it becomes. The richer it becomes, the more accepted it is as a common means of communication. And the more common a means of communication it becomes, the more it contributes to a connected experience -- the real endgame in the new Interconnectedness of Things.

This piece first appeared in Wired Magazine October 2014

How Millennials Are Driving 2015 Tech Trends

This time of year, just about every industry analyst and technology vendor trots out his or her predictions for top technology trends to watch next year, and I’m no different.

This time, though, I’ve also sifted through what other industry observers have to say. Interestingly, nearly everyone’s predictions stem from the same reality in the workplace -- namely, that Generation Y and Millennial employees have certain expectations for technology in the enterprise, which savvy vendors are meeting.

The emerging trends for 2015 embrace expanded use of video in Unified Communications, Infrastructure- and Software-as-a-Service, and low-cost computing devices. Let’s take a closer look at each.

Unified Communications (UC), Video and the Cloud

In October, Frost & Sullivan released a report on the “Office of the Future,” noting that Millennials will make up 75 percent of the US workforce by 2025 and that the average user may utilize four devices per day. The workforce is becoming increasingly mobile, and the work environment itself is becoming smarter than ever. That means next-generation technologies need to be introduced into the UC concept just to accommodate the expectations of Millennial and Generation Y employees.

Another analyst firm, Ovum, also places Unified Communications on its list of 2015 Trends to Watch. Noting that cloud-based communications services are transforming UC, Ovum suggests that better integration and lower price points will continue to drive enterprise adoption of video into this service.

In my experience, businesses are rapidly moving to cloud-based solutions for enterprise communications. Video is not only becoming part of every conference room; it’s increasingly at the fingertips of nearly every employee. Now that public cloud offerings like Amazon Web Services and the IBM Cloud have the bandwidth, stability and availability to be practical business solutions, employees can reliably and securely meet over video wherever they happen to be – whether on their laptops, tablets, or smartphones.

Infrastructure and video software delivered via the cloud are dramatically changing the UC framework, creating lower-priced, highly scalable solutions for a connected experience among employees, vendors, partners, and customers.

Computing Everywhere, IT-as-a-Service

Building on the idea above, the analyst firm Gartner in October released predictions for top IT strategic trends in 2015. Like its competitors, Gartner saw a greater emphasis on mobility, noting that the computing environment “will need to adapt to the requirements of the mobile user.”

Part of that emphasis for business was what Gartner calls “Computing Everywhere,” which it connects with “Cloud/Client Computing and Web-Scale IT.” As cloud and mobile computing converge, industry will see the growth of “centrally coordinated applications that can be delivered to any device,” Gartner noted.

The maturation of Infrastructure-as-a-Service technology is leading to more and better Software-as-a-Service applications for the enterprise. These cloud offerings are redefining how we think about team collaboration and document sharing.

These days, just about any company can put video IT infrastructure in the cloud, to huge business benefit. Security issues are dramatically reduced, which lets companies make the best use of the technology on whichever device an employee happens to be using.

By delivering and maintaining IT as a service in the cloud, you’ve nearly done away with expensive, complicated on-premises server technology. Combine that with less need for complicated security protocols, and collaboration within and among companies becomes much easier. As the year goes on, we’ll start to see an even greater variety of cloud offerings, such as real-time communication applications. By that point, computing truly will be everywhere.

Low-Cost Computing in the Enterprise

Businesses are starting to look more closely at ultra-low-cost computing devices like Chromebooks. According to another Gartner report this past summer, Chromebook sales could triple to 14.4 million units by 2017. This year’s sales of the devices are on track to top last year’s by a whopping 79 percent.

Isabelle Durand, principal analyst at Gartner, said that "by adopting Chromebooks and cloud computing, businesses can benefit; they can shift their focus from managing devices to managing something much more important — their data."

They’re not only managing their data better; they’re also managing their people, productivity and communications better. One big reason is that end points are changing dramatically. These low-cost computing devices can lead to dramatic cost reductions for video and video-related applications. With a Chromebook and a quality pan/tilt/zoom camera, a business can cut the cost of video in laptop communications by up to 80 percent per end point.
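To see where a figure like that could come from, consider a hypothetical comparison (my numbers, not any vendor’s price list): a conventional video-equipped business laptop endpoint at $1,500 versus roughly $300 for a Chromebook plus camera.

\[
\frac{\$1{,}500 - \$300}{\$1{,}500} = 80\%
\]

The exact savings will vary with the hardware chosen, but that is the order of magnitude in play.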

And let’s not kid ourselves -- Millennials and Generation Y employees are driving the rapidly growing business interest in Chromebooks, at least to some extent. Chromebook sales were concentrated in the education market in past years, and the students who swore by them a few years ago are today’s up-and-coming corporate workers.

That’s why so many predictions about technology trends in the coming year have an overt focus on video. The young people who will account for the majority of the labor force in just over a decade are not only very comfortable with video, they prefer it as a means of engagement.

We’re at an interesting crossroads, technologically speaking. Increasing demand for video, reduced infrastructure costs, improved software capabilities and dramatically more affordable end points are all being pushed by a very real demographic shift in the workforce and in how these workers interact with technology.

This piece first appeared in Wired Magazine February 2015