USB Type-C, the new standard designed to make our connectivity headaches disappear, has finally arrived and is ready for mainstream use. USB Type-C has been talked about and developed for quite a while now. We first heard about it over a year ago, but the first time we actually saw it in action was in January, at CES. Now that Apple has launched its high-profile, ultra-minimalist MacBook, USB Type-C has finally gone public.
USB Type-C is meant to work with virtually any type of device. More importantly, the plug is symmetrical: it fits either way up, on either end of the cable. This solves one of the most common frustrations of USB users: knowing at a glance which way the plug should go in. Type-C was designed with exactly this problem in mind.
However, there are other aspects to consider and, undoubtedly, new problems that will require further attention. To begin with, users will be able to charge their laptops over USB Type-C, just as tablets and phones charge over USB today. However, not all hosts are capable of delivering that much power. USB Type-C will also be able to replace VGA, dedicated HDMI, and DisplayPort outputs on many devices, but that does not mean every gadget with a USB Type-C port can be plugged into a projector or screen.
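To make the power point concrete, here is a minimal Python sketch of why a laptop cannot necessarily charge from just any Type-C port. The voltage/current pairs follow the fixed-supply profiles in the USB Power Delivery specification; the helper functions and the comparison logic are illustrative assumptions, not part of any spec.

```python
# Illustrative sketch: why not every host can charge a laptop.
# Voltage/current pairs below follow USB Power Delivery fixed-supply
# profiles; the helper functions are made up for illustration.

PD_PROFILES = [
    (5.0, 3.0),    # 15 W  - phones, small accessories
    (9.0, 3.0),    # 27 W  - tablets
    (15.0, 3.0),   # 45 W  - small laptops
    (20.0, 5.0),   # 100 W - full-size laptops, monitors
]

def max_power(volts, amps):
    """Power in watts a source can deliver at a given profile."""
    return volts * amps

def can_charge(required_watts, source_profiles):
    """True if any profile the source offers meets the sink's need."""
    return any(max_power(v, a) >= required_watts for v, a in source_profiles)

# A 60 W laptop cannot charge from a 15 W phone-class port:
print(can_charge(60, PD_PROFILES[:1]))   # False
print(can_charge(60, PD_PROFILES))       # True (the 20 V / 5 A profile)
```

The same connector thus spans everything from a 15 W phone charger to a 100 W laptop supply, which is exactly why a given host may or may not be able to charge a given device.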
So, if you are still wondering how exactly this newly developed standard will affect you, we can provide all the necessary answers.
USB Type-C: The background
USB 1.0 and 1.1 ports started to appear on computers in the late '90s, and the new standard caught on quickly. USB offered an alternative to the parallel, PS/2, serial, and MIDI ports most PC users had to deal with. One of its most tempting features, however, was the ability to hot-plug devices: users no longer had to shut down a PC to make hardware changes. Data transfer speed went up to 12Mbps, sufficient for printers and other input devices.
In 2000, USB 2.0 increased the speed to 480Mbps, opening the way to a whole new generation of peripherals and storage devices. This is when USB became a viable standard for transferring data to and from gadgets like mobile phones and portable media players. Battery charging followed naturally as an extension, evolving the standard toward better power delivery. Several experiments aimed at a wireless USB standard, but little progress was made, mainly due to limited flexibility and power requirements.
In 2008, USB 3.0 arrived as a response to standards like eSATA and FireWire, which performed better at high-speed, bi-directional data transfer. The theoretical peak speed was raised to 5Gbps and, for the first time, new cables and sockets were required. The new USB 3.0 plugs and ports were designed to remain compatible with older gadgets, which significantly increased their size and cost. Nevertheless, no major disruption was caused and users were not disoriented.
Only five years later, USB 3.1 was released. The peak speed was doubled to 10Gbps, yet it used the same ports and plugs as USB 3.0: the standard changed once again, while the connectors stayed the same.
To make things even more confusing, the "USB 3.1" label also applies retroactively to 5Gbps USB 3.0 connections.
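The generations described above can be summarized in a short Python snippet. The peak rates are the nominal figures from the article; the megabytes-per-second conversion is the usual theoretical bits-to-bytes division and ignores protocol overhead.

```python
# Nominal peak signaling rates of the USB generations discussed above.
# Values in Mbps; real-world throughput is lower due to overhead.
USB_SPEEDS = {
    "USB 1.1 (Full Speed)":  12,
    "USB 2.0 (High Speed)":  480,
    "USB 3.0 (SuperSpeed)":  5_000,
    "USB 3.1 (SuperSpeed+)": 10_000,
}

for name, mbps in USB_SPEEDS.items():
    # Divide by 8 to convert megabits to megabytes per second.
    print(f"{name}: {mbps} Mbps (~{mbps / 8:.0f} MB/s theoretical)")
```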
Imagining our world without USB is almost impossible. We would still depend on a specific port and plug for each type of device, and syncing data, charging devices, or moving files around would be a nightmare of incompatibilities. The USB age freed us from installing drivers and rebooting every time we plugged a new device into the PC.
USB Type-C: A brief introduction
To begin with, USB Type-C is not a new USB version, and it is not a replacement for USB 3.0 or 2.0. The name applies only to the plug itself: Type-C is a new alternative connector for USB 3.1. Keep in mind that USB 3.1 can also be implemented with traditional USB ports and cables. USB Type-C will often be associated with USB 3.1, but there is no guarantee that a device supporting USB 3.1 speeds actually uses a Type-C connector, and vice versa.
USB Type-C will replace the Type-A and Type-B connectors. The original USB design put a different connector on each end of the cable precisely to prevent mistakes like plugging one printer into another or, worse, connecting two power sources to each other.
As time passed, Mini-B and Micro-B connectors were developed for increasingly smaller devices. Little by little, however, gadgets that were originally designed as targets gained host capabilities: users wanted to plug pen drives into a smartphone, print directly from a storage device, or use a touchscreen tablet as a control surface. USB On-the-Go (OTG) made it possible for a device with a Type-B port to host other target devices. Nevertheless, the different port shapes made dongles a necessity.
USB Type-C will let us plug virtually anything into anything else. In theory, gadgets will sense each other and immediately establish what to expect from one another in terms of control, charging, and data exchange. Incompatible products will simply not work. Users might still feel frustrated, but hopefully less often.
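The "sense each other" idea can be sketched as a simple capability match: each side advertises what it supports, and only features both sides share are used. This is a hypothetical illustration of the concept; the capability names and the set-intersection logic are invented for the example and are not actual USB specification terms.

```python
# Hypothetical sketch of capability negotiation between two
# Type-C devices. Names are illustrative, not real spec terms.

def negotiate(host_caps: set, device_caps: set) -> set:
    """Return the features both sides advertise.

    An empty result means the pairing is effectively incompatible,
    even though the plugs fit.
    """
    return host_caps & device_caps

laptop  = {"data-usb3.1", "power-source-60W", "displayport-alt"}
monitor = {"data-usb2.0", "power-sink-60W",   "displayport-alt"}

# The plugs fit, but only the video feature is mutually supported:
print(negotiate(laptop, monitor))  # {'displayport-alt'}
```

The point of the sketch: the connector being universal does not make every pairing functional, which is exactly why an incompatible combination will "simply not work" rather than half-work.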