On August 6, 1991, without fanfare, British computer scientist Tim Berners-Lee published the first-ever website while working at CERN, the European particle physics laboratory near Geneva, Switzerland.

Berners-Lee later stated, “In those days, there was different information on different computers, but you had to log on to different computers to get at it. Also, sometimes you had to learn a different program on each computer.”

Not to sound clichéd, but we’ve come a long way, baby.

In March 1989, Berners-Lee gave managers at CERN a proposal for an information management system that used hypertext to link documents on different computers connected to the Internet. His boss, Mike Sendall, famously annotated it “vague but exciting,” and the proposal was not accepted at first.

Berners-Lee teamed up with Robert Cailliau, a Belgian engineer at CERN, to refine the proposal, and in 1990 his boss gave him time to work on the project. After originally calling the project Information Management, Berners-Lee tried out names such as Mine of Information and Information Mesh before settling on World Wide Web.

Although today we talk about the Internet and the World Wide Web as synonyms, they actually designate different things. When we “surf the Internet,” we are really browsing the World Wide Web (WWW), just one of the many services (alongside email and file transfer) that run over the Internet, the underlying network of networks.
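To make that layering concrete, here is a minimal Python sketch; it assumes Python 3.8+ and network access, and example.com is simply a placeholder host. The TCP connection it opens is the Internet at work, while the HTTP request and response sent across it are the Web, one application riding on top.

```python
import socket

# The Internet supplies the transport (a TCP/IP connection); the Web is an
# application-layer service (HTTP) running over it. example.com is a
# placeholder host used purely for illustration.
with socket.create_connection(("example.com", 80), timeout=10) as sock:
    # An HTTP/1.1 request is just structured text sent down the TCP stream.
    request = (
        "GET / HTTP/1.1\r\n"
        "Host: example.com\r\n"
        "Connection: close\r\n"
        "\r\n"
    )
    sock.sendall(request.encode("ascii"))

    # Read the Web server's reply off the underlying Internet connection.
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

# Print just the status line and headers, the "Web" half of the exchange.
print(response.decode("utf-8", errors="replace").split("\r\n\r\n")[0])
```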

Fast-forward to 1999, when writer and consultant Darcy DiNucci coined the term Web 2.0 in her article “Fragmented Future.” Tim O’Reilly and Dale Dougherty began popularizing it in 2004 to describe a new, more participatory, interactive, and social web. For perspective, Facebook was founded that same year, and Twitter launched in 2006.

Since 1995, when just 14% of Americans went online, the share of U.S. adults who use the Internet has climbed to more than 90%, according to Pew Research Center surveys.

The COVID-19 pandemic drove many commercial and social activities online, and for some people the Internet has become an ever more crucial link to those they love and the things they need.

As Americans turn to the Internet for critical purposes, debates have been rekindled about how the digital divide, the gap between those who have access to technology and those who do not, may hinder people’s ability to complete everyday tasks or even schoolwork.

As most schools around the nation closed and classes and assignments shifted online, policymakers raised concerns about how less digitally connected students would fare in this new learning environment.

For younger generations, the Internet has always been there, a basic utility taken for granted. That isn’t true for everyone. Even though the Internet is only about 50 years old and access has expanded enormously, we still have leaps and bounds to go on equitable access to a technology that is no longer a perk but an essential for thriving in school, work, healthcare, and beyond.

Melody K. Smith

Sponsored by Data Harmony, a unit of Access Innovations, the world leader in indexing and making content findable.