Communications seems to be a part of all electronic devices these days. What electronic product doesn’t have a communications function? It’s hard to name one. And that trend continues as almost everything is wireless, connected to the Internet, or both.
Cell Phones
Smart phones dominate the consumer market, with more than 50% of subscribers now owning one. In some parts of the world, the smart-phone market is already saturated. That’s why cheaper smart phones like Apple’s iPhone 5c have emerged. Soon there will be fewer basic cell phones as the inexpensive models morph into inexpensive smart phones.
Another big trend continues to be larger screens. Virtually every smart phone has a 4-inch or greater screen with sizes up to 6 inches becoming available. That is probably the limit as tablet screens start at 7 inches. I doubt we will see the smart phone morph into a tablet, although Samsung’s Note 3 with its 5.7-inch screen and Nokia’s Lumia 1520 with its 6-inch screen are truly both.
Future smart phones may have curved screens, but who needs or wants that? Since every smart phone has become just a black slab of metal and glass, manufacturers keep searching for ways to distinguish one phone from another.
Samsung dominates smart-phone sales with its Galaxy series, and its Note series is doing well too. The Note 3 is available with a smartwatch accessory called Samsung Gear (Fig. 1). Apple is in second place but still does well with its iPhone 5c and 5s models. A new iPhone with a larger screen may come along this year. And will Samsung really abandon Android for its own Tizen operating system (OS)? We shall see.
The Gear smartwatch talks to the handset by Bluetooth. It displays time, messages, and other data when you don’t want to take your phone out of your pocket or purse. Smart watches from Pebble, HOT, Qualcomm (Toq), and others, potentially including Apple, are gradually emerging as a major accessory for smart phones. Samsung reported sales of 800,000 Gear smartwatches to resellers in two months. I might try one myself. Yet it is just an accessory like a Bluetooth headset, and it may not be the big hit that everyone expects. I predict it will remain a niche.
Another wearable with lots of hype is Google’s Glass. Many expect it to be a big hit. Wearable computers have been forecast and tested in various forms for years. But I am not sure about this one, since it does make users look dorky. It is amazing technology with its voice commands, clever screen, and constant video “spying” mode. This is another niche, in my opinion. However, Juniper Research forecasts that global smart-glasses shipments will reach 10 million units per year by 2018. We shall see.
Business And Standards
Microsoft’s acquisition of Nokia’s handset division will be an interesting combination, like the Google-Motorola arrangement. We can expect some interesting new smart phones and tablets running the Windows Phone OS. BlackBerry, though, is slowly fading away. It’s a shame that the original smart-phone maker has lost its cachet and the marketing battle and is now on its way to oblivion. It’s a waste of great technology and products.
The LTE rollout continues with all the major carriers aggressively building LTE capacity to keep up with the ever-increasing demand for video connections. Average LTE download speeds are in the 8- to 15-Mbit/s range, which is better than some wired DSL or cable TV broadband connections. LTE is becoming the norm at least in the U.S. and some parts of Europe and Asia.
Interestingly, the time division duplex (TDD) version of LTE is becoming more widely used than expected. Most of the world uses the frequency division duplex (FDD) version of LTE. Yet the TDD version requires half the spectrum and is one good answer to the chronic spectrum shortage. China Mobile is the big user of TDD LTE, and we may see many others adopt it to expedite LTE rollout and added capacity.
LTE-Advanced (LTE-A) trials are under way, and some carriers could implement small areas with this aggressive, updated version of LTE. Its carrier aggregation combines multiple contiguous and non-contiguous 10- or 20-MHz channels to provide channel bandwidths up to 100 MHz, enabling downlink speeds up to 3 Gbits/s. Higher-level multiple-input multiple-output (MIMO) up to 8x8 is also a part of LTE-A, which should lead to improved connection reliability as well as higher speeds.
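To see where that 3-Gbit/s figure comes from, here is a rough back-of-envelope sketch. It assumes about 75 Mbits/s per spatial layer per 20-MHz carrier (the familiar 150-Mbit/s Category 4 figure is two layers on one 20-MHz carrier) and glosses over overhead differences, so treat it as an approximation rather than a spec calculation.

```python
# Back-of-envelope LTE-Advanced peak downlink rate from carrier aggregation.
# Assumption: ~75 Mbits/s per spatial layer per 20-MHz component carrier.
PER_LAYER_PER_CARRIER_MBPS = 75

def lte_a_peak_downlink(carriers_mhz, mimo_layers):
    """Estimate peak downlink rate for a set of aggregated component carriers."""
    total_bw = sum(carriers_mhz)                       # aggregated bandwidth, MHz
    rate = sum(bw / 20 * PER_LAYER_PER_CARRIER_MBPS * mimo_layers
               for bw in carriers_mhz)                 # Mbits/s
    return total_bw, rate

bw, rate = lte_a_peak_downlink([20, 20, 20, 20, 20], mimo_layers=8)
print(f"{bw} MHz aggregated -> ~{rate/1000:.1f} Gbits/s peak downlink")
# 100 MHz aggregated -> ~3.0 Gbits/s peak downlink
```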
According to the ITU, LTE-A is the real 4G. It will be interesting to see how the carriers promote it, maybe even as 5G. LTE-A is mostly in a field test mode right now, although SK Telecom in South Korea deployed its LTE-A network last July. In the U.S., we won’t see LTE-A reach any critical mass until 2015 and beyond.
Another major cellular trend is the small-cell or heterogeneous networks (HetNet) movement. Small basestations called femtocells, picocells, and microcells will supplement the larger standard macrocells to provide improved coverage and higher speeds over a smaller area. The technology will be LTE and LTE-A as well as some 3G fallback.
Such networks do not yet exist except in trial test form, but you will begin seeing more of them later this year and beyond as carriers try to deal with the demand for greater subscriber capacity, higher speeds, and the continuing pressure on spectrum. ABI Research forecasts a 70% increase in enterprise femtocells in 2014 and a tenfold increase by 2018, driven by the increasing need for indoor voice coverage and video data capacity.
One factor aiding the small-cell movement is Wi-Fi offload. Data traffic can be diverted from the stressed cellular network to a nearby available Wi-Fi hotspot. That is why most microcell and picocell basestations will implement LTE, legacy 3G, and Wi-Fi. A recent survey by Maravedis-Rethink for the Wireless Broadband Alliance indicated that tier 1 mobile carriers expect 75% of their small cells to include Wi-Fi by 2018. Furthermore, the carriers expect about 20% of their data capacity to come from Wi-Fi. Added capacity and speed will come from active antennas with beamforming as well as distributed antenna systems (DAS).
So what is 5G? Small cells could be promoted as 5G, although a HetNet is still only LTE or LTE-A. One potential version of 5G uses millimeter-wave basestations operating at frequencies beyond about 30 GHz, easing the current spectrum shortage that most carriers experience as they build out their networks. The range is short at these frequencies, so a small-cell, short-range configuration is inherent. Antenna beamforming and beam steering will make it work even in crowded metropolitan areas.
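A quick free-space path-loss comparison shows why millimeter-wave cells must be small and heavily beamformed. The sketch below uses the standard Friis free-space formula only; real urban links see additional losses from rain, foliage, and blockage.

```python
import math

def fspl_db(distance_km, freq_ghz):
    # Free-space path loss in dB: 92.45 + 20*log10(d_km) + 20*log10(f_GHz)
    return 92.45 + 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz)

for f in (2.0, 28.0, 60.0):
    print(f"{f:>4} GHz at 200 m: {fspl_db(0.2, f):.1f} dB path loss")
# Every tenfold increase in frequency costs 20 dB, which is why millimeter-wave
# basestations cover small areas and lean on high-gain beamforming antennas.
```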
Ted Rappaport of NYU Wireless has already proven this concept and tested it in the challenging New York City area. It is a real possibility. Something new also could show up in the years to come. You won’t see 5G for a while yet, so stay tuned as the real 5G emerges.
Short-Range Wireless
The three main short-range wireless technologies—Wi-Fi, Bluetooth, and ZigBee—continue their forward movement. We all seem to take Wi-Fi for granted, as it is virtually everywhere. Tablet and laptop users really get upset if there is no Wi-Fi connection. And while Wi-Fi could be considered a legacy technology, it continues to improve and expand.
The 802.11ac standard and certification process has been active for nearly a year. This 5-GHz-only version offers expanded data rates thanks to new modulation and MIMO configurations. Data rates can exceed 1 Gbit/s under the right conditions. However, it has yet to be widely incorporated. Plenty of silicon is now available, and some laptops already include it. It still has to make its way into cell phones, but that is coming. Lots of new routers and access points will need to be put into service before its benefits fully emerge.
The 60-GHz version of Wi-Fi, also known as WiGig, is now available along with its product certification program. Based on the IEEE 802.11ad standard, it isn’t widely deployed due to a shortage of chipsets and its limited applications. It’s intended for video connectivity among consumer products and is showing up on HDTV sets, laptop docking stations, and video cameras. Versions to support the VESA standard and USB 3.0 are in the works. WiGig can support data rates to about 7 Gbits/s with a range of up to 10 meters thanks to high-gain beamforming antenna arrays. Greater adoption is expected as more IC vendors join the market.
Wi-Fi offload is also part of the small-cell movement, as mentioned earlier. Some vendors are making “carrier-grade” access points that are more rugged, reliable, and capable. Maravedis-Rethink predicts that the number of Wi-Fi hotspots will more than double from 2012’s total to reach 10.55 million in 2018. And don’t forget, more airlines are making Wi-Fi available during flight to please the restless and electronically addicted. The Federal Aviation Administration (FAA) also has added new rules allowing electronic device use during most phases of flight, though it doesn’t seem like a good idea unless you want to hear dozens of one-sided phone calls.
Wi-Fi is expected to dominate the smart appliance market as well. Washers, dryers, refrigerators, and other major appliances will increasingly incorporate Wi-Fi to connect through home networks and supply data back to the manufacturer via the Internet. ABI Research predicts growth to nearly $25 billion in this market by 2018.
Finally, last year the Federal Communications Commission (FCC) announced the possibility of adding 195 MHz of additional spectrum to the 5-GHz band for Wi-Fi use. This potential ruling lets Wi-Fi share spectrum with some government services under the guidance of the National Telecommunications and Information Administration (NTIA), the government’s spectrum regulatory arm. That will mean a great deal to the success of the 802.11ac rollout if and when that spectrum is deployed.
Bluetooth’s adoption continues to increase. The latest version, Bluetooth Low Energy (BLE), is a big hit with suppliers of medical and fitness gear. In addition, the wearable electronic trend uses Bluetooth to connect watches and glasses to smart phones. Many new watch products are now available with more to come. It’s an interesting trend, and we shall have to wait and see if it is just a passing fad or long-term movement.
Near-field communications (NFC), the short-range 13.56-MHz wireless technology, has yet to be widely deployed. It was expected to become the technology of choice for mobile payments, replacing or supplementing credit card payments with a tap of your phone. While some Android phones have adopted NFC and payment systems like Google Wallet and ISIS are in place, overall usage has been sparse. NFC chips are not yet in Apple iPhones, and that has limited adoption. Its future is fuzzy.
One interesting development is the recent collaborative effort of the Bluetooth SIG and the NFC Forum. Bluetooth can use NFC for pairing when connecting two devices, and other opportunities will no doubt be explored.
ZigBee continues its almost quiet penetration of the industrial and commercial wireless markets. It is used in building automation, the Smart Grid, and home monitoring networks. The ZigBee Alliance recently announced the latest applications standard for the retail services area. It will enable items like personal shopping assistants, intelligent shopping carts, electronic shelf labeling, asset tracking tags, and employee and customer concierge services.
M2M And IoT
Perhaps the hottest trend in wireless today is the rapid adoption and application of machine-to-machine (M2M) and Internet of Things (IoT) services. M2M and IoT are essentially the same thing as they both seek to connect devices, things, and non-person stuff to the Internet for various monitoring and/or control functions.
M2M applications dominate right now with most activity centered around truck and other vehicle monitoring, inventory control, asset tracking, and physical site monitoring. The connected car will also use M2M, although other wireless methods are expected as applications such as vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) systems are implemented. The IoT or Internet of Everything (IoE) movement is in its infancy but will be used to connect appliances, home energy and security systems, and other devices that require monitoring, maintenance, or repair.
M2M connectivity is primarily cellular with special embedded cellular modems (Fig. 2). IoT connections can be cellular, but other technologies are used. The major appliance manufacturers have selected Wi-Fi as their wireless standard, though ZigBee is used in many home monitoring networks and Smart Grid connections.
For example, a home appliance with an embedded Wi-Fi transceiver would use the home Wi-Fi router to connect to the Internet and the cloud (Fig. 3). There, cloud services would provide the link to remote devices like a laptop, tablet, or smart phone. Application software would implement the desired monitor or control function. The cloud services and software are the key to a successful IoT or M2M function.
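As a concrete illustration, here is a minimal sketch of what that appliance-to-cloud report might look like. The endpoint URL, field names, and appliance ID are hypothetical stand-ins; a real product would use whatever API its manufacturer’s cloud service defines.

```python
# Minimal sketch of an appliance reporting status to a cloud service over the
# home Wi-Fi link. The URL and payload fields below are hypothetical.
import json
import urllib.request

def report_status(appliance_id, status):
    """POST a status record to the (hypothetical) cloud endpoint."""
    payload = json.dumps({"id": appliance_id, "status": status}).encode("utf-8")
    req = urllib.request.Request(
        "https://api.example-cloud.com/v1/appliances/status",   # hypothetical URL
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status   # the cloud service relays the data to the user's phone app

# Example call (commented out because the endpoint is fictional):
# report_status("washer-0042", {"cycle": "rinse", "minutes_left": 18})
```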
Most of the major cellular carriers offer M2M/IoT services, and multiple vendors are emerging to provide applications development, platforms for deployment, middleware, and other related services. Forecasts call for approximately 50 billion to 100 billion connected devices by 2020. But what will we do with all that data? The projections for this sector are excellent.
Infonetics projects M2M services to total $31 billion by 2017 with a total of 4 billion M2M connections. ABI Research expects the M2M market to grow at a compound annual growth rate (CAGR) of 26% from 2012 to 2016. Strategy Analytics forecasts that M2M connections will total 2.9 billion by 2022.
In a related prediction, ABI Research forecasts that global V2V penetration will reach 69% of new cars by 2027. Cars will talk to one another and to nearby infrastructure for safety and traffic congestion purposes. The most likely technology is the Wi-Fi-like 802.11p Dedicated Short-Range Communications (DSRC) standard using the 5.9-GHz spectrum. Other technologies like LTE and Bluetooth may also be used.
The use of white spaces is another major trend. White space is unlicensed spectrum comprising the unused TV channels in the 54- to 698-MHz range. These 6-MHz wide channels can be used to carry data and provide an interesting option for broadband Internet access or wireless backhaul for Wi-Fi or other wireless services.
The number and frequency of available channels varies from locale to locale, so a system of identifying useful channels that will not cause TV interference has been developed. It consists of cognitive radios that can seek out available unused channels and assess their potential for interfering with TV signals or nearby wireless microphones. The radio works in cooperation with an online database of TV stations, wireless microphones, medical telemetry, and other services that could suffer interference.
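The channel-selection step boils down to filtering the locally protected channels out of the TV band. The sketch below is purely illustrative: the protected-channel set and helper function are hypothetical stand-ins for what a white-space database lookup would return, not any specific database API.

```python
# Purely illustrative channel-selection logic. The protected-channel set is a
# hypothetical stand-in for the online database response at a given location;
# channel numbers are U.S. TV channels 2 through 51 (roughly 54 to 698 MHz).
TV_CHANNELS = range(2, 52)

def pick_white_space_channel(protected_at_location):
    """Return the first 6-MHz TV channel not protected for TV, wireless mics, etc."""
    for ch in TV_CHANNELS:
        if ch not in protected_at_location:
            return ch          # the radio may transmit here at the permitted power
    return None                # no vacant channel, so the radio stays silent

# Example: the database reports these channels as occupied near the radio.
print(pick_white_space_channel({2, 4, 7, 9, 11, 13, 24, 31, 45}))   # -> 3
```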
By using advanced modulation methods, data rates of about 20 Mbits/s are possible under good conditions. However, the real value of white space is the low frequency range that permits longer-range connections and some non-line-of-sight connectivity. Instead of the 100-meter maximum typical range of Wi-Fi, white space radios can deliver connections over many miles.
White spaces are not yet widely used. Several test systems have been built and successfully deployed. Furthermore, companies like Google, Spectrum Bridge, and Telcordia have developed comprehensive databases, and others have created radios for the technology’s application. Wireless standards have not been finalized, although existing standards like 802.11af and 802.22 as well as several proprietary standards are being considered. Watch for future developments in this sector.
The critical factor in most wireless development and deployment is available spectrum. Most of the spectrum below 6 GHz has been assigned and is in common use. Cell-phone carriers crave new spectrum to expand their LTE and M2M services. They buy, sell, and swap spectrum to patch together nationwide networks.
The FCC plans an auction later this year to make additional spectrum available to those who can afford it. In another move, the FCC plans to announce the rules for and the availability of spectrum beyond 95 GHz. The technology is finally emerging to make these outer limits of frequency useable.
Wired Communications Trends
Wireless trends may dominate the market, but we cannot do without those remaining wired technologies and services. The plain old telephone system (POTS) with its vast unshielded twisted pair (UTP) infrastructure is still there and being used. Fewer than half of all U.S. homes now rely on just a wired phone, and most have a cell phone as well. Yet the POTS is still widely used for DSL Web access not only in the U.S. but also in most of the rest of the world.
While dialup phones in the public switched telephone network (PSTN) will continue to decline, that infrastructure will still be there and continue to serve a useful purpose. The FCC and even some of the main telephone carriers have proposed a phase-out of the PSTN. It may seem inevitable but will take many years with much upheaval to accomplish.
Cable TV systems still lead the way in Internet service in the U.S. These hybrid fiber coax (HFC) systems run fiber to a neighborhood node and then connect to individual homes and businesses with coax cable, usually RG-6/U. These systems can deliver high-speed Internet service with rates to 50 Mbits/s, if you really need it and are willing to pay for it.
Most cable television, voice over Internet Protocol (VoIP), and Internet services are delivered over cable by a system called the Data Over Cable Service Interface Specification (DOCSIS). Developed by CableLabs, this ITU-ratified standard is used in the U.S., Canada, the United Kingdom, and parts of Europe. The current version 3.0 uses 6-MHz (8-MHz in Europe) channels and 64 or 256 QAM (quadrature amplitude modulation) on up to 750 MHz of coax bandwidth to deliver digital TV and Internet connections. With channel bonding (combining several 6-MHz channels), data rates to several hundred Mbits/s are possible, with rates into the gigabit range if the cable company decides to use enough channels simultaneously.
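A rough bonding calculation shows how those rates add up. The sketch assumes roughly 38 Mbits/s of usable payload per 6-MHz, 256-QAM downstream channel, a commonly quoted approximation after framing and FEC overhead, so the totals are estimates rather than guaranteed figures.

```python
# Rough DOCSIS 3.0 downstream estimate: assume ~38 Mbits/s of usable payload
# per 6-MHz, 256-QAM channel (approximate figure after overhead).
USABLE_PER_CHANNEL_MBPS = 38

def bonded_downstream(channels):
    """Approximate aggregate downstream rate for a bonded channel group."""
    return channels * USABLE_PER_CHANNEL_MBPS

for n in (4, 8, 16, 24):
    print(f"{n:>2} bonded channels -> ~{bonded_downstream(n)} Mbits/s")
#  4 -> ~152, 8 -> ~304, 16 -> ~608, 24 -> ~912 Mbits/s (approaching 1 Gbit/s)
```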
CableLabs released the latest version, DOCSIS 3.1, late last year. It greatly expands the data rate and bandwidth capabilities. Version 3.1 uses orthogonal frequency division multiplexing (OFDM) in channels that may be from 24 to 192 MHz wide on coax with a bandwidth to 860 MHz. Modulation within the channel can be 256, 1024, 2048, or 4096 QAM.
New forward error correction (FEC) based on low-density parity check (LDPC) codes improves noise performance. New DOCSIS 3.1-based HFC systems can easily compete with direct fiber connections, as they can provide downstream data rates up to 10 Gbits/s. Version 3.1 has yet to be deployed, but look for trials this year and gradual adoption and phase-in over the coming years.
Fiber to the home (FTTH) systems are growing. The latest, Google Fiber, runs the fiber directly to the dwelling, providing a 1-Gbit/s data rate for Internet service. The Google Fiber system also can supply TV as an alternative to cable TV or DSL video services. This service is now being rolled out in Kansas City and soon will be rolled out in Provo, Utah, and Austin, Texas.
Verizon’s FiOS system is another fiber direct-to-the-home system. It has been around since 2005 and is now available in 16 states with more on the way. FiOS can be an Internet access service, a TV service, a VoIP telephone connection, or all of the above. It uses passive optical networking (PON) technology like gigabit PON (GPON) to deliver download speeds to 500 Mbits/s. PONs are inexpensive and are the connection of choice for the buildout of new cities and subdivisions.
Fiber is the backbone of the Internet. Most carriers are updating their networks to the 100-Gbit/s level if they have not already done so. Most systems have adopted dual polarization-quadrature phase shift keying (DP-QPSK) as the modulation mode for 100 Gbits/s over a single fiber. The protocol uses the optical transport network (OTN) specifications of the ITU.
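The DP-QPSK arithmetic is straightforward, as the short sketch below shows. The symbol rate of roughly 28 Gbaud is an assumed, commonly cited ballpark that leaves headroom for OTN framing and FEC overhead.

```python
# Why DP-QPSK works out to roughly 100 Gbits/s on one wavelength.
bits_per_symbol = 2          # QPSK carries 2 bits per symbol
polarizations = 2            # dual polarization doubles the bits per symbol period
symbol_rate_gbaud = 28       # assumed ballpark, with headroom for framing and FEC

line_rate = bits_per_symbol * polarizations * symbol_rate_gbaud
print(f"~{line_rate} Gbits/s line rate -> ~100 Gbits/s of payload after overhead")
```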
This Internet Protocol (IP) system has mostly replaced legacy SONET systems. It encapsulates Ethernet packets for transport in different formats. Market research firm Ovum predicts that the global optical networks market will exceed $17.5 billion in 2018 as 100-Gbit/s systems are more widely adopted for large-scale long-haul networks.
Furthermore, 400-Gbit/s systems are being implemented to further ensure that the massive demand for video over the Internet can be met in the future. Four streams of 100-Gbit/s data on different wavelengths on a single fiber get you to 400 Gbits/s. Also, 500-Gbit/s and 1-Tbit/s systems are being developed and tested. The largest data centers (Amazon, Google, Apple, Microsoft, etc.) are already clamoring for more speed and bandwidth.
One major continuing development is the increased use of Carrier Ethernet (CE). CE includes hardware and software enhancements to standard Ethernet that allow common carriers to deploy Ethernet services in metropolitan and wide-area networks (WANs). It brings carrier-grade qualities to transport networks.
Ethernet is the standard used in virtually all local-area networks (LANs) small and large. Connecting to metro and long-haul networks meant using other technologies like SONET/SDH (synchronous digital hierarchy) or older asynchronous transfer mode (ATM) or frame relay services. With such high-volume usage, Ethernet is a very low-cost technology and very attractive as a potential candidate for longer-range networks.
Ethernet provides the simplicity, flexibility, and cost-effectiveness of its well-known protocol and components. It is a best-effort service suited mainly to LANs, though, so it typically cannot deliver the reliability of other long-haul networks like SONET/SDH. Carrier Ethernet was developed to overcome this and other objections to Ethernet.
The Metro Ethernet Forum (MEF) developed CE more than a decade ago to define standards that allow Ethernet to be used in metropolitan-area networks (MANs) and wide-area networks (WANs). It provides for greater connection resiliency and adds management services like provisioning and standard operation and maintenance functions. Overall, it implements greater provision for scalability and flexibility. As a result, CE is increasingly being adopted for Internet connections and other access in MANs and WANs.
Vitesse Semiconductor is providing Carrier Ethernet switches and supporting software to implement IoT connections as well as microwave and fiber backhaul for macrocell basestations, small cells, and enterprise femtocells (Fig. 4). Note how CE scales from 1-Gbit to 10-Gbit levels with standard protocols and hardware. Continued growth in CE is projected.
Software-defined networking (SDN) separates the control and data planes, allowing system administrators to configure a network of routers and switches with software to optimize connections. According to SDN Central’s definition, SDN is “a new software-centric approach to networking that reduces capital and operational cost through programmatic control of network infrastructure, facilitating customization, optimization and innovation.” SDN is not yet widely implemented, but watch for developments this year and beyond as it will redefine the Internet and other networks as we know them today.
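To make the control-plane/data-plane split concrete, here is a purely illustrative sketch of the kind of match/action flow rule a centralized controller pushes down to its switches. The field names and the install helper are generic placeholders, not tied to any specific controller or southbound protocol.

```python
# Illustrative only: a generic match/action flow rule of the sort an SDN
# controller installs in switches once forwarding decisions move to software.
flow_rule = {
    "match": {"in_port": 3, "ip_dst": "10.0.4.0/24", "ip_proto": "tcp", "tcp_dst": 443},
    "actions": [{"type": "set_queue", "queue_id": 1},     # prioritize this traffic
                {"type": "output", "port": 7}],           # then forward out port 7
    "priority": 100,
    "idle_timeout_s": 300,
}

def install(switch_id, rule):
    """A centralized controller would send this rule to the switch over its southbound API."""
    print(f"installing on {switch_id}: {rule['match']} -> {rule['actions']}")

install("edge-sw-01", flow_rule)
```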
IEEE Computer Society Experts Forecast for 2014
The IEEE Computer Society has announced its list of the top technical trends of 2014.
• Mobile cloud convergence will lead to an explosion of new services.
• The Internet of Things will evolve into the Web of Things, increasing the coordination between things in the real world and their counterparts on the Web.
• New analytics tools will emerge to handle the big data deluge, and new leaders will emerge in this arena in 2014.
• New tools and techniques will bring 3D printing power to corporations and the masses.
• Online courses demand new technological approaches.
• Mobile infrastructure must catch up with user needs and demands.
• New risks and concerns about social network privacy will emerge.
• Intelligent systems and assistive devices will advance smart healthcare.
• Agencies will attempt to tackle e-government interoperability issues.
• Scientific cloud computing will further change how science is done to help solve “grand challenges.”
To view the full details of the IEEE Computer Society’s predictions, visit www.computer.org/portal/web/membership/Top-10-Tech-Trends-in-2014.