Messages in the Deep

The Remarkable Story of the Underwater Internet

The Underwater Cold War: Operation Ivy Bells

In October 1971, at the height of the Cold War, the nuclear-powered submarine USS Halibut entered heavily guarded Soviet waters in the Sea of Okhotsk. Orchestrated by the joint efforts of the CIA, NSA, and U.S. Navy, her mission—code-named Operation Ivy Bells—was to find and tap an undersea communications cable connecting a Soviet naval base on the Kamchatka Peninsula with the Pacific Fleet’s mainland headquarters in Vladivostok. The cable was a vital pipeline for Soviet communications, and as such represented a tremendous potential intelligence resource for the United States.

In 400 feet of water, the Halibut’s crew found the cable and installed a unique waterproof tap that was designed to detach and remain on the seafloor if the cable was ever hauled up for repairs. Each month for the next decade, the navy secretly returned to retrieve the recordings and install new tapes—until 1981, when NSA employee Ronald Pelton sold classified information about Operation Ivy Bells to the KGB for $35,000.

USS Halibut
Image: US Navy

Pelton failed to cover his tracks; he was exposed, convicted of espionage, and continues to this day to serve out his life sentence. Although his betrayal compromised the Ivy Bells operation, the success in the Sea of Okhotsk prompted the United States to carry out many more underwater cable-tapping operations in the years that followed. In 1979, the USS Parche travelled from San Francisco to the Barents Sea to install another tap on a Soviet undersea cable, which ultimately remained undetected and in use until 1992. In 1985, the navy expanded its operations to the Mediterranean, tapping cables spanning from Europe to North Africa. The USS Parche remained in active service until 2004, receiving numerous presidential commendations throughout the 1990s for her many classified missions around the world.

USS Jimmy Carter
Image: US Navy

Today, Parche has been replaced by the USS Jimmy Carter, which is reputedly retrofitted with a special floodable chamber that allows divers to move freely between the interior and exterior of the submarine during underwater operations—a feature that many have speculated is used for modern-day undersea cable tapping.

Since the days of Cold War-era spying, underwater communication cables have proliferated across the globe. According to data released by TeleGeography, a leading telecommunications market research firm, there are 277 undersea fibre optic cables in the world today. These cables carry 99% of all international communications, including Internet and telecom traffic. They span a total of 986,543 km, and each day route a quantity of data equivalent to several hundred US Libraries of Congress. This mass expansion of communication across the globe has forced surveillance methods to adapt and evolve accordingly.

Below, you’ll find an interactive map charting the growth of the modern fibre optic network from 1989 to the present day, including cables due for completion in the next few years. You can select a country to see all connections to and from that nation, or specify a provider to see the parts of the network owned by that company.


Special thanks to TeleGeography for making this data public.

Submarine Surveillance: Fibre Optic Tapping

Operation Ivy Bells, carried out in the 1970s, was likely the first underwater cable-tapping operation in history. At the time, copper wires, which radiate electromagnetic energy, were used to relay relatively small amounts of data that could be intercepted non-invasively by fitting a recording device around the cable. The Soviets were so confident in the security of their line that they left it completely unencrypted, enabling US intelligence teams simply to record the transmissions, retrieve the tapes from the seabed and listen to them at the end of each month.

Modern cable systems use fibre optics, which carry data as pulses of light. Tapping them can be accomplished in one of two ways: either by splicing the cable and splitting the light stream with a prism, or by bending the cable to the point where it begins to leak light. The documents leaked to The Guardian by Edward Snowden in 2013 revealed how British and American intelligence agencies had tapped more than 200 of these cables as part of an ongoing mass-collection spying project launched in 2008, completely undermining the privacy of ordinary citizens across the world. The Guardian revealed how the British intelligence agency GCHQ was intercepting data at a rate with the potential of collecting the equivalent of “all the information in all the books in the British Library 192 times every 24 hours”.

Public responses to these revelations have been far-reaching. Reuters reports that the EU has threatened to suspend data transfer agreements with the US until Washington strengthens guarantees to protect the privacy of EU citizens. French telecom provider Orange says it will soon sue the NSA for illegally tapping its undersea cables. It has also been reported that Privacy International, a British activist organisation, recently filed a lawsuit against the UK government for what it deems unlawful spying.

NSA Headquarters, Fort Meade
Image: National Security Agency

Over 80% of international fibre optic data from Latin America currently routes through the United States, meaning that even legislation passed in other nations would be largely powerless to stem intrusive governmental data collection. Brazilian President Dilma Rousseff announced plans this February for a new $185 million transatlantic fibre optic cable linking her country directly to the EU, which she claims will “guarantee the neutrality” of Brazilian Internet traffic by circumventing US soil; it remains unclear how it will be protected from US or British tapping.

The questionable security of the planned Brazil-to-EU cable underscores a much wider issue in the network of fibre optic cables that traverse the globe and support nearly the entire infrastructure of worldwide Internet and telecommunications. The cost and logistical demands of patrolling such expansive systems make it simply unfeasible—particularly without international governmental support. Even if private companies did have the resources and motivation to protect their cables from surreptitious tapping, they could still be compelled by governments to allow it. In fact, as The Washington Post reported in 2013, many foreign and domestic telecom companies with lines leading into or out of the US have already been legally compelled by the FBI, FCC, and Department of Homeland Security to provide full access to their fibre optic cables.

Keeping the World Wide Web Connected

Undersea cabling has been in use since the 1840s. In the early days, cables were engineered to enable transatlantic telegraphs, and were then developed to carry telephone calls and faxes. Today, in the region of 200 fibre optic cables make up an undersea network that transports Internet data around the world, carrying more than 95% of transoceanic voice and data traffic.

Underwater cable, circa 1939
Image: American Telephone & Telegraph

As a growing number of nations ‘log on’ to high speed bandwidth and the world’s dependency on the network increases, these expansive cable systems and the industry responsible for their functionality have never been more important. The subsea cabling industry is, however, one that the majority of service users are only vaguely, if at all, aware of, despite the significance of its role in keeping us all connected.

Much of the industry day-to-day is focused on planning and maintaining cable systems. The condition of a system is continuously monitored by equipment installed in cable landing stations, manned sites typically located where cables come ashore. Fibre optic cables are about the same diameter as a garden hose and are buried only a few feet below the seabed, leaving them susceptible to a variety of risks. The technology in landing stations detects the slightest fault in a cable – from a minor degradation in performance to a complete break – triggering an alarm to alert the technicians who are responsible for taking action.

Cable Landing Station, Bangladesh
Image: Wikimedia/Mak

Submarine cable systems are designed not to need maintenance during their working life. Modern systems are set up in a way that enables upgrades for greater traffic capacity to be made solely through improvements to the equipment in the landing station, leaving the cables untouched. External interference, however, undermines the durability of the system design on a regular basis, and as a result a large section of the industry is dedicated to conducting repairs to cable systems.

In the early days, before burying tautened cables below the seabed became common practice, notable incidents of damage involved aquatic wildlife. Between 1877 and 1960 there were 16 recorded whale entanglements. In the entire history of submarine cabling, a total of about 40 faults have been attributed to “fish bites”, although these were mainly to telegraph cables prior to 1964. One exception came between 1985 and 1987, when a domestic fibre optic cable in the Canary Islands was damaged by sharks. Improvements to the design and installation of cables have since strengthened the systems to the point where no further wildlife-related damage has been reported.

Despite improvements to design and engineering techniques, natural disasters such as typhoons and earthquakes remain unpredictable and devastating threats to cable systems around the world. Extreme weather events have the potential to take out large numbers of cables and cause full ruptures at different points of a cable at great depths. Coastal installations, such as landing points and stations, are also at risk from rising sea levels, tsunamis and extreme conditions.

Neptune: “Ah-o-o-o-oy there! Get off o’ that ‘ere cable, can’t yer- That’s the way t’other one was wrecked!!!”
Image: John Tenniel, 1865.

In December 2006, communications were rocked across Asia when the Hengchun earthquake severed a whopping 80% of the cables connecting Taiwan with the rest of the world. The breakages took out half of Hong Kong’s Internet capacity and cut China’s access to foreign websites. The outage shook the region’s financial institutions, stretching from Seoul to Sydney, and proved particularly damaging to the foreign exchange market. The severity of the impact highlighted how crucial the Internet has become to the structures of the global economy.

While the scale of damage caused by natural disasters is by far the greatest, the most prevalent incidents of breakage are invariably a result of human interference. A particularly remarkable incident of man-made damage took place in 2011, when an elderly Georgian woman cut through a cable owned by Georgian Railway Telecom while she was digging for copper. The break resulted in 90% of users in Armenia losing their Internet connection for 12 hours.

Out at sea, the most common cause of faults continues to be trawlers and ship anchors. Charting of cables has become much more advanced than in the early days, and GPS has provided most ships, even the smallest trawlers, with very accurate navigation systems. Nonetheless, in the region of 70% of cable faults at depths of less than 200m can be attributed to accidental anchor damage or trawling. The industry runs cable awareness campaigns, but they are not entirely effective, and figures indicate that about 100 to 150 cables per annum are damaged in this way. A trawler crew focused on a good catch fails to notice it is approaching a cable, a drifting ship drags its poorly secured anchor across a cable – accidents happen. Fortunately such incidents are almost always in shallow water, making them easier to repair.

Users do not usually notice any disruption when a fault occurs in a submarine cable, except perhaps a brief click on a phone call or a momentary pause while a website loads. To maintain this level of service, older cables were designed as rings, with switching equipment in the system’s landing stations which automatically re-routes traffic round the other way, usually in less than half a second, if a cable is cut. This is an effective but expensive set-up, leaving capacity on one side of the ring unused to ensure immediate availability if there is a fault.

Visualisation of a portion of the Internet
Image: Opte.org

Modern networks are designed as meshes whereby a telecommunications operator or Internet service provider has access to capacity on several cable systems and can arrange their traffic so that it is automatically re-distributed among the other systems if one fails.

If more than one cable fails and there is not enough spare capacity in the mesh, users will experience more substantial effects of a breakage, with calls failing and Internet speeds dropping dramatically – as was the case following the Hengchun earthquake in 2006. Some developing countries, such as Bangladesh, may have just one cable when they first join the international cable network, and therefore have no spare capacity for a ring or mesh connection. In these cases, the affected country is forced to fall back on older technologies, such as satellite, which offer a heavily reduced capacity. Users experience a sharp drop in Internet services and an almost total loss of international phone connections until the broken cable is repaired.
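To make the idea concrete, here is a minimal sketch of that re-routing logic in Python. The landing points and links below are invented for illustration; real systems manage capacity, latency and cost in far more sophisticated ways, but the core idea of searching for a surviving path when a link fails is the same.

```python
from collections import deque

# Toy mesh of undersea links between landing points (all names invented).
links = {
    ("New York", "Lisbon"), ("New York", "Fortaleza"),
    ("Lisbon", "Fortaleza"), ("Lisbon", "Lagos"),
    ("Fortaleza", "Lagos"),
}

def neighbours(city, active):
    """All landing points directly linked to `city` by an active cable."""
    for a, b in active:
        if a == city:
            yield b
        elif b == city:
            yield a

def route(src, dst, active):
    """Breadth-first search for any surviving path between two landing points."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in neighbours(path[-1], active):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no surviving path: fall back to satellite

print(route("New York", "Lagos", links))        # a path via Lisbon or Fortaleza
after_break = links - {("New York", "Lisbon")}  # an anchor severs one link
print(route("New York", "Lagos", after_break))  # traffic re-routed automatically
```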

One of the real miracles of today’s international telecommunications network, apart from the fact that it works at all, is that a user does not need to worry about any equipment apart from their own phone. If, for instance, you call a friend in Tokyo, you do not need to think about what type of phone your friend has, which operating system or network that phone is using, or any of the details of the system connecting the two devices. The call may pass through several countries and will be connected at the far end by a Japanese phone company. You will be using equipment owned by a number of different companies, but will only be required to make a payment to a single one – your own phone service provider.


A reason all this works is that all the networks, including billing systems, are built in accordance with standards produced by an international organisation called the International Telecommunication Union (ITU). The ITU is an agency of the United Nations. It has ‘Study Groups’, attended by telecommunications operators and manufacturers, which look at different aspects of the telecommunications network. Despite being riven with politics, the groups successfully produce and regularly update hundreds of ‘recommendations’ that ensure everything works seamlessly (most of the time!), and that the final service user is shielded from the engineering details of this very complex network.

Most of the Internet is also built in accordance with ITU recommendations, but has in addition its own standards bodies that look at the specialised aspects that allow any computer system and software to use the Internet and enable the many different Internet Service Providers to work together. The most notable of these bodies is the Internet Engineering Task Force.

How are breaks located?

When technicians are alerted to a fault or failure in the system, tests are conducted from cable landing stations or cable management centres to identify the location of the break. There are three main methods of testing:

  • Most cable breaks expose the cable’s metallic centre conductor to the sea water, causing a short circuit to earth. A cable’s electrical resistance per kilometre is documented during its production, enabling technicians to calculate the distance to a fault by measuring the resistance between the landing station and the break (a calculation sketched, along with the optical method below, in the code example after this list). This method has changed little since the 19th century. Simple, but effective.
  • Long cables, say over 300 km, will have subsea amplifiers, known as repeaters, typically spaced 80 km apart. Each repeater has a circuit that responds to a special signal sent from the cable landing station. If one repeater ‘answers’ and the next does not, then the fault must lie between them. Some repeaters have more sophisticated test circuits that provide information about their health or the level of the incoming signal, which can further help to pinpoint the fault.
  • When a pulse of light is launched into the fibre, some of its power is reflected back when it hits the break. As technicians know the speed of light within the fibre, they can calculate the distance to the fault by measuring the time it takes for a pulse of light to hit the fault and return.

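The first and third of these methods come down to simple arithmetic. Below is a minimal sketch in Python; the resistance-per-kilometre figure and the fibre’s group refractive index are illustrative assumptions, not data from any real cable.

```python
# Sketch of the resistance and time-of-flight fault-location calculations.
# All constants here are illustrative, not taken from any real cable system.

C = 299_792_458  # speed of light in a vacuum, in metres per second

def distance_by_resistance(measured_ohms, ohms_per_km):
    """Resistance method: the short circuit at the break closes the loop,
    so distance = measured resistance / documented resistance per km."""
    return measured_ohms / ohms_per_km

def distance_by_reflection(round_trip_seconds, group_index=1.47):
    """Time-of-flight method: light travels at roughly C / group_index in
    silica fibre, and the pulse covers the distance twice (out and back)."""
    speed_in_fibre = C / group_index                       # ~2.0e8 m/s
    return speed_in_fibre * round_trip_seconds / 2 / 1000  # distance in km

# A fault measuring 820 ohms on a cable documented at 0.8 ohms per km:
print(f"{distance_by_resistance(820, 0.8):.0f} km")   # ~1025 km from the station

# A reflection returning 1.2 milliseconds after the pulse was launched:
print(f"{distance_by_reflection(1.2e-3):.1f} km")     # ~122.4 km from the station
```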
None of these methods are exact and there is always some doubt about where the fault is located before the ship arrives on site.

How do you go about repairing a cable?

To repair a cable in deep water, it must be brought to the surface so that the faulty section can be removed and replaced. The cable will have been laid too tautly on the seabed to be lifted to the surface in one piece; it must first be cut on the seabed. The basic tool of deep water cable repairs is the grapnel, a piece of equipment designed to hook the cable on the seabed.

A ‘cutting grapnel’ is towed across the cable to cut it. This grapnel is specially designed to cut close to the seabed, thus minimising disturbance. The cable ship will usually try to cut the cable well away from the fault or break. A holding grapnel will then be used to recover one of the cable ends.

Cable repair in action

When the end is on board, the ship’s technicians will test it to see if the end they have is the side of the cut leading to the fault. If it is the non-faulty end, it will be sealed, lowered to the seabed on a rope and tethered to a floating buoy for later recovery. The ship will then retrieve the other side of the cable initially cut and test it to confirm the presence and location of the original faults.

The ship will pick up the cable until the fault is recovered and can be cut out. Further testing will be done to confirm the cable is fault-free and new cable will be jointed onto it. The ship will then lay new cable back to the location of the buoy, pick up the other end of the cable and joint the two together with a ‘final splice’ to restore the cable system. This joint will then be lowered to the seabed to complete the repair and a plough used to bury it below the seabed.

In shallow water where cables are usually buried, typically down to a depth of 1000m, repairs follow the same principle as deep water repairs but can be assisted by a submersible. A submersible is able to swim along the cable and possibly locate the fault using its cameras. It can make the initial cable cut on the seabed and attach lift lines to the cable for its retrieval.

Has this process changed throughout your career?

Installation and repair techniques have evolved throughout my career, for example the introduction of ploughs in the 1970s to bury cables, and the use of submersibles to assist with repairs. The development of GPS, radar and communications equipment has also greatly improved how we navigate.

View from inside submersible
Image: National Oceanic and Atmospheric Administration

There was a massive shift in technology in the late 1980s when copper cables with analogue amplifiers, which had been used for over 30 years, were replaced by fibre optic cables. Fibre optics provide much greater capacity at a much cheaper cost per unit. We had to develop new techniques and methods for testing and jointing the fibre optic cables, but the use of grapnels and submersibles for cable repairs remained very much the same.

Despite these technological developments and changes within the industry, if one of the early pioneers from the 1850s returned today I believe they would still recognise most of what goes on in a modern cable ship.

What are the biggest challenges of being at sea?

Techniques for installing and repairing cables are well understood and have been developed and refined over the last 150 years or so. Similarly, safety standards on ships have improved greatly over the years as modern health and safety practices used in other offshore industries have been adopted by the submarine cable operators. Modern ships are more powerful and better designed than ever, but there are still limitations that cannot be avoided.

The greatest challenge remains the weather. Cable ships do not have a choice about where they work, but the improvement in weather forecasting techniques has made it possible to avoid repairing or laying cables in the worst weather conditions. Terrible weather can make the process of repairs unacceptably hazardous to personnel and risk damage to the cable itself. In these situations, there is no alternative but to suspend operations and wait for conditions to improve, delaying the completion of the repair.

Another challenge, which is becoming more important, is the increasing competition to use the seabed as a resource for oil and gas exploration, offshore energy and fishing in new areas. Cable owners can no longer assume that they are free to lay cables where they like or to repair them without regard for other users of the seabed.

Ndurance, Cable Layer, 2012
Image: Flickr/Kees Torn

Piracy is a serious problem in many of the world’s oceans. Cable ships are particularly vulnerable as they will be stationary or moving very slowly for long periods. Before working in areas where attacks are possible, the ship owner contracts specialist security companies to advise what precautions are necessary. The ship may need to carry additional security guards or in extreme cases have a naval escort while it is working.

What is the repair process like for staff and crew at sea?

Cable repairs and installation jobs take a long time, sometimes several weeks. The worst thing is that there is never any certainty about when a job will finish. Things can go wrong during the final stages, setting back the completion of the work by several days. Having finished one job, the ship may get instructions to sail directly to a new one, without going back into port. My longest trip at sea was 33 days. Fortunately, we had a very creative chef on board who continued to produce interesting meals throughout the voyage despite his much depleted stores.

It is difficult to get much sleep in rough weather, which may continue for several days. Cable ship staff are therefore faced with doing highly-skilled work, while sleep deprived, to restore the Internet cables to full operation.

Unfortunately, the guaranteed cure for mal-de-mer, proposed by the comedian Spike Milligan: “go and sit under a tree”, is not often available to us while repairing cables.

Data is Power

The Internet has permeated 21st Century life to such an extent that its functionality now plays a pivotal role in enabling the performance of many backbone institutions in modern society – from the operations of national security departments to the global economic system.

Surveillance drone
Image: US Air Force

The significance of subsea cables for national security by no means ends with the tapping of civilian communications revealed by Snowden. The rise of connected military technologies, such as UAVs, has contributed to a change in the way the military and similar establishments operate, and has dramatically increased reliance on immediate access to data from across the world. In his report advising the US government to adopt tighter security measures for undersea communication cables, Harvard researcher Michael Sechrist comments: “Milliseconds matter when you are in the UAV business, and undersea cables shave off hundreds of them compared to satellites”. UAVs require 500 Mbps of bandwidth each to function. Their missions depend massively on global network reliability, with disruptions delaying the execution of operations and breaking down communications with forces abroad. Consequently, a cable break halfway round the world can pose a substantial problem to a country’s national security activities.

“When communications networks go down, the financial services sector does not grind to a halt, rather it snaps to a halt” – Stephen Malphrus, Chief of Staff to Federal Reserve Chairman Bernanke in 2009.

Similarly, national economies are at risk of damage when cable systems are disrupted. Commercial dependency on the Internet has exploded over the last decade and is continuing to grow. Companies across all industries use cables to transfer trillions of pounds across the world on a daily basis. According to a study conducted by McKinsey & Co, the Internet accounted for 21% of GDP growth in mature economies between 2006 and 2011. Stephen Malphrus, Chief of Staff to Federal Reserve Chairman Bernanke, noted in 2009, “When communications networks go down, the financial services sector does not grind to a halt, rather it snaps to a halt”.

Stock exchange trading floor
Image: Flickr/Kevin Hutchinson

Quantifying the cost to the economy when a nation experiences an Internet blackout is difficult, and there is little up-to-date research readily available. It has been estimated that a complete loss of international communications could cost a country like America over $150 million a day. A report produced by the Swiss Federal Institute of Technology in Zurich calculated that a blackout lasting one week would cost Switzerland 1.2% of its annual GDP. This estimate, however, dates back to 2005, and is therefore likely to fall well short of today’s reality; the number of Internet users across the world has risen dramatically from approximately 900 million in 2005 to almost 3 billion today, about 40% of the world’s population. According to McKinsey & Co’s report, the Internet’s total contribution to global GDP, if measured as a sector, had by 2011 already outgrown both the agriculture and energy industries.

An increase in Internet maturity contributes an average growth of $500 in real GDP per capita.

The Internet’s effect on national economic growth makes it a crucial factor for development in low and middle income countries. McKinsey & Co point to a link between a growing Internet ecosystem and a rising standard of living. Their research, which examined high income countries over the past 15 years, found that an increase in Internet maturity contributes an average growth of $500 in real GDP per capita. It took the Industrial Revolution of the 19th century 50 years to produce the same impact. Expectations have consequently been high for the economic impact of low income countries finally connecting to the global fibre optic backbone of the Internet.

East Africa was the last major region of the globe to plug into high speed broadband, with the launch of SEACOM’s multi-million pound subsea cable system in 2009. The region was previously dependent on the slow, costly and unreliable service of satellites. Connectivity, as Tanzanian President Jakaya Kikwete commented, will enable East Africans to “become part of the global economy”. South Sudan, most recently, is following suit, announcing this June its aim to connect to the international fibre optic cable system through neighbouring Kenya, Ethiopia and Eritrea within the next year.

Connecting to the Internet, Nairobi
Image: Flickr/Erik Hersman

Over the last five years, advances in East Africa’s communications infrastructure have brought significant changes in economic opportunity and practice, but prices and limited capacity in certain nations continue to hold back growth. Since the installation of the cable system, the fibre optic capacity of the region has shot up by over 10,000 percent. In Kenya, mobile Internet access has enabled people across the country to transfer money through their mobile phones, proving valuable for the millions of citizens lacking bank accounts. Distance learning is proving a valuable educational resource for people living in remote communities, providing young people in particular with new opportunities to develop skills and qualifications. Kenya is East Africa’s largest economy and has access to 2 Tbps of connectivity. In contrast, neighbouring Ethiopia, Africa’s second largest country by population, lags behind with a lowly 9 Gbps, demonstrating a difference in national investment in the infrastructure. Tech hubs have been popping up across Ethiopia, and the region as a whole, but their success and growth continue to be held back by the challenge of limited connectivity. Even in Kenya, despite vast improvements in speed and reliability, prices remain relatively high compared to initial expectations, largely due to the cost of local infrastructure and maintenance.

With the increase in the number of international fibre optic cable systems in service and the rise in dependency on high speed broadband for everything from national security to locating your local hardware store, the maintenance and governance of subsea cable systems has irrevocably become a matter of global interest. In his 2010 report for the Department of Homeland Security, academic Michael Sechrist argues for establishing an international partnership, bringing governments, service providers and the private cabling industry together to ensure the protection, optimisation and maintenance of undersea cable systems across the world. Concerns over data terrorism aimed at the cable infrastructure came to the forefront in 2013, when three men were arrested off the coast of Alexandria for allegedly cutting the SEA-ME-WE 4 cable – a major cable connecting the region, including much of East Africa, to the rest of the world. Damage to the cable affected 614 networks connected to Telecom Egypt.

A current lack of diversity in the locations of cables leaves certain systems particularly vulnerable. Sechrist’s report speaks of an 18-inch cable pipe underneath an unprotected manhole in downtown New York that caters for the majority of traffic between New York and London. In the Middle East, the density of cabling in the Suez Canal has created a bottleneck, where cuts and breakages could prove as devastating as the Hengchun earthquake in Taiwan in 2006, which disrupted financial transactions and online commerce at a cost of millions of dollars. The physical vulnerability of the network, and the potential damage a serious disruption can cause, creates a need to ensure that both the public and private sectors have disaster plans in place and work towards alleviating cable pressure points, such as the Suez Canal, through diplomacy.

The Future of Connectivity

It’s easy to absent-mindedly imagine the Internet as “the cloud” – a vast, ethereal accumulation of information held in the databanks of billions of computers; a collection of large and small networks connected by satellites and WiFi.

Fibre Optics
Image: Flickr/placbo

To some it may seem almost bizarre to think that this technology, once deemed the province of science fiction, is in fact connected using cables – a global extension of the familiar wires that clutter our offices and the bedroom floors of teenagers. The extensive network consists of over 980,000 km of fibre optic cables buried beneath the seabed between continents, spanning the world’s seas and oceans. The mountain of data available is housed across innumerable individual networks and a multitude of data centres across the globe (Google alone has 12 data centres across America, Europe and Asia, allowing them to process over 20 billion web pages per day).

There are currently 263 active cables in place, with a further 22 planned in coming years. It is estimated that between 95% and 99% of all Internet usage is carried along these cables. The huge network, made up of cables, Internet exchange points, cable landing stations and roadside regeneration huts, carries billions of gigabytes of data every month.

In this age of connectivity, even the most mundane elements of our daily lives rely on access to the Internet. In a typical British household, you will find between three and four smartphones, a laptop, a PC, and at least one tablet. Yet it is not simply computers and mobile devices that now demand bandwidth. From everyday items, like toothbrushes, to essential medical technology, such as pacemakers, our reliance on the Internet and our need for greater and more reliable bandwidth is steadily growing. This growing network of people and Internet-enabled devices, processes, systems and services is referred to as the Internet of Things.

Global Internet traffic in 2013 was approximately 51 exabytes (that’s 51 billion gigabytes) per month. This will increase to 132 exabytes by 2018

According to Cisco’s Connections Counter, in July 2013 the Internet of Things consisted of over 10 billion things connected to the Internet. It is estimated that this will increase to around 50 billion by 2020. All of these connections combine to generate an astounding amount of global Internet traffic. As part of their Visual Networking Index, Cisco have calculated that global Internet traffic in 2013 was approximately 51 exabytes (that’s 51 billion gigabytes) per month. They forecast that this will increase to 132 exabytes by 2018.
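Those figures imply compound growth of roughly 21% a year, as a quick back-of-the-envelope calculation shows:

```python
# Implied annual growth rate from Cisco's forecast:
# 51 exabytes/month in 2013 rising to 132 exabytes/month in 2018.
traffic_2013, traffic_2018, years = 51, 132, 5
cagr = (traffic_2018 / traffic_2013) ** (1 / years) - 1
print(f"{cagr:.1%}")  # -> 20.9%
```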

Smart Technology: the dawn of the Internet of Things
Image: Flickr/Leon Lee

Bandwidth capability will need to increase in tandem with this rise in demand to support the developments in technology and commercial opportunity brought on by the Internet of Things. In April 2014, telecommunications research and consulting firm TeleGeography reported that demand for international bandwidth increased by 39% to 138 terabits per second (Tbps) in 2013 – over four times the global demand in 2009 (30 Tbps). Demand is expected to increase threefold again by 2018.

Telecommunication companies and content providers alike are working on new technologies and projects to accommodate the rapidly increasing traffic and the demand for improved broadband. Giants such as Google and Microsoft are now focusing on connecting data centres to one another, rather than connecting end users to data centres. Projects are already underway to improve the capacity of the cables themselves; the new SeaMeWe 5 100G cable system will provide 24 Tbps, considerably more than the existing cable system.

Cable chaos? The ceiling of a data exchange
Image: Flickr/Xeni Jardin

To ensure the impact of any future breakages and faults is minimised, operators are developing contingency plans which incorporate alternative connections between continents, such as secondary lines and satellite uplinks, to enable a faster reroute of traffic. Operators are also considering alternative locations for cable systems via South America and the Arctic, and encouraging the creation of terrestrial systems where possible – especially in bottlenecked waters like the Suez Canal.

Addressing concerns of sabotage and data terrorism, security companies and software developers have created both physical and cyber security systems at various points in the infrastructure to protect it from abuse, from firewall systems for data centres to cloud-based Internet intelligence and intrusion prevention systems for networks.

While connectable devices now outnumber people in the world, less than 40% of the global population has access to the Internet.

In addition to improving and protecting the current infrastructure, projects focused on bringing Internet connectivity to those in rural and remote areas of the world are rapidly developing. While connectable devices now outnumber people in the world, less than 40% of the global population has access to the Internet.

The pilot of Google’s Project Loon, developed by Google’s X Lab, was launched in June 2013, and comprises a fleet of solar-powered “smart” balloons floating in the stratosphere (approximately 20 km above the Earth). The balloons connect to special antennas on the ground to deliver Internet connections to even the most remote locations. The system calculates where each balloon is needed to get the optimal signal, and uses the layers of wind in the stratosphere to position each balloon to create a viable Internet network. A similar project is being developed by Facebook’s Connectivity Lab. Both Google and Facebook recently acquired companies that make high-altitude drones, with the intention of using the technology to bring the Internet to the remaining 5 billion people on earth, predominantly in low to middle income countries, who do not yet have access.

In Africa, whilst 70% of the population have WiFi-enabled devices such as mobile phones, only 10% have access to the Internet. This is because many areas are extremely remote, and do not have the infrastructure in place to be connected to the Internet backbone (the submarine cabling network). Most significantly, however, a huge proportion of homes across the continent cannot afford 3G coverage.

Some African authorities, such as Tshwane (the South African capital municipality), have tried to address this problem. Using the glut of WiFi-enabled devices and the explosion of new radio technology start-ups (triggered by Motorola releasing teams of engineers to start their own companies), Tshwane has begun rolling out a scheme providing WiFi for low-income communities. By the end of January 2014, over 25,000 people had benefitted from the initiative. The scheme is to be rolled out to a further 1 million people by the end of 2014, and to a total of 3 million by the end of 2015.

Project Loon
Image: Flickr/iLighter

Bridging this global information divide is the objective of a number of initiatives, including Outernet and Oluvus, two start-ups working towards ubiquitous and democratic access to the Internet.

Outernet hopes to enable the circumvention of Internet censorship and political control of the web by developing free, globally accessible Internet connectivity. The organisation is currently developing a network of miniature satellites that will work in conjunction with existing geostationary satellites already in orbit. These will connect to the Internet through ground stations, up-linking data packets requested by a community as a whole. The packets will be broadcast in loops so that the data is continuously updated even where the signal is poor.

Oluvus, set up by the campaign group A Human Right, is preparing to launch its cable-focused services later this year. The company will endeavour to provide places like refugee camps with Internet access, offering some of the world’s most vulnerable people access to information and opportunities for growth through technology.

Campaign group ‘A Human Right’ has also successfully ensured that high speed Internet access will reach the 4,200 citizens of St Helena, one of the world’s most remote islands, located in the Atlantic Ocean, which has until now been almost entirely isolated from the global communication network. With the support of the UN, the group successfully lobbied for the proposed route of the new South Atlantic Express cable to be moved 500 km, bringing it close enough to the island for it to be connected to the cable system.

More than 4,000 scientists in the Antarctic are producing significantly more data than they can export using the existing communications infrastructure

Improving Internet connectivity in remote areas will not only have a profound effect on the billions of people in low to middle income countries who are not currently ‘plugged in’ – it will also assist the communication of pivotal research from remote locations such as the Antarctic, which currently struggles with intermittent connections. More than 4,000 scientists in the Antarctic are producing significantly more data than they can export using the existing communications infrastructure, which relies mostly on geostationary satellites. These satellites tend to work only for stations along the coast, typically with low data transfer rates and unreliable reception.

With increasingly important data on climate change being bottlenecked at ground level, governments, academia and big business are working together to investigate a number of possible solutions. These include connecting Antarctica to the Internet backbone’s extensive network of submarine cables and using new and improved satellite technology specifically designed for the task of extracting research data from the region.

Antarctica: The Final Frontier
Image: Wikimedia/ravas51

Using cables to connect the icy continent to the global network is likely to be less expensive than using satellites, and fibre optic cables are capable of transferring larger amounts of data. They would also provide access to stations away from the coastline. But with only three months of the year yielding temperatures high enough to provide access to the cable network under the ice, and with the challenges created by the movement of the ice shelf itself (around 10 metres each year), cabling has not yet proven feasible, and the Antarctic remains a final frontier for the submarine cabling industry.

Visions for the long term future see a network transmitting at up to a terabyte per second – that’s up to 1,048,576 MB/s.

Meanwhile, Antarctic Broadband, a programme supported by the Australian Space Research Program (ASRP), is using small-satellite technology customised to the needs of its users. The satellites will be placed into opposing highly elliptical orbits, ensuring that each will “dwell” over the South Pole for around 18 hours a day. With the satellites orbiting the Earth in opposing phases, the continent will receive continuous coverage. Radio communication and highly directional transmitting and receiving antennas will allow for hugely increased data links.

As engineers and scientists navigate a stronger communications path to this final frontier in the South, parallel communication technologies used by top physicists and powerful research agencies offer a glimpse into what the Internet of the future may look like. While industry professionals ponder the possibility of bringing an Internet connection speed of 10 gigabits per second to the public (1,000 times faster than current rates), NASA has successfully demonstrated data transfer speeds of a whopping 91 gigabits per second over a direct connection through their shadow network ESnet. This rate equates to transferring a Blu-ray disc of data every 2.1 seconds.

Visions for the long term future see a network transmitting at up to a terabyte per second – that’s up to 1,048,576 MB/s, a rate 100,000 times faster than the connection of the average home today. Researchers from NASA and MIT have also been successfully working on transmitting reliable wireless Internet connectivity to entirely new frontiers: the moon, with hopes of providing transmission capabilities on deep space missions to Mars and beyond in the future.

It is clear that the future will hold many more new, novel and transformative ways for humans to connect across vast physical distances, as we continue to build a world in which geographical location is of scant consequence. The die is yet to be cast on how those interactions will be characterised and the longer term impact these new and powerful connections will have on the development of global society.

Even seemingly philanthropic endeavours, such as Google’s project ‘Loon’, cannot be separated from the underlying commercial value of opening new markets to the Internet.

Most recognise the web’s power to catalyse great change, to grant voices to the disenfranchised and provide those who would otherwise remain stuck in the margins with direct access to knowledge. The Internet is the carrier of our commerce, an ever expanding library of our deeds, and a great forum traversing social, political, geographical — even linguistic divides.

What remains unclear, and which fragments opinion, is the question of Internet ownership and data rights. For the firms that own the hardware that makes up the backbone of the web, and the providers who buy their bandwidth, money remains the primary end-goal. Even seemingly philanthropic endeavours, such as Google’s project ‘Loon’, cannot be separated from the underlying commercial value of opening new markets to the Internet. At the same time, many governments seek to tap this endless stream of information for the purposes of surveillance and censorship, citing the need to combat greater threats.

Fighting the battle to place the Internet in the hands of the people are projects like Outernet, which seek to ensure freedom from corporate and political influence and the creation of a web that does not discriminate, while campaigners for net neutrality go even further, wishing to codify the values of openness on which the Internet was founded into the laws of the land.

What the future holds for the Internet, and its long term impact on the world, is uncertain. But it is easy to acknowledge that its physical and conceptual form plays, and will come to play, a huge and diverse role in the lives of many, hinging crucially on the actions and decisions of the countless individuals, corporations and institutions who are shaping what the network is today and what it stands to become.
