
Minority Report

The film Minority Report: An early example of Augmented Reality

A History of Augmented Reality (circa 2035):

In the late 20th century, tech and entertainment companies began toying with the idea of “Augmented Reality,” which is just another way of saying “enhancing our perception of the real world.” Early uses of AR were most common in sports and entertainment: the yellow first-down marker displayed on telecasts of NFL football games and the sponsorship logos emblazoned on the outfield grass during baseball broadcasts were two familiar examples.

In the early 21st century, the prevalence of smartphones with built-in cameras, GPS receivers, and accelerometers put primitive AR programs in the pockets of millions of Americans. An early example of one such mobile AR program was the Urbanspoon app. The “scope” function of the application let you point your camera at the horizon and see every restaurant nearby, as well as its distance, price range, type of cuisine, and popularity, all displayed over a real-time image of the horizon you were scanning.
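(As a side note, the basic math behind that “scope” view was surprisingly simple. The sketch below is purely illustrative, not Urbanspoon’s actual code: it shows how a phone could take its own GPS fix and compass heading, compute the bearing and distance to a nearby restaurant, and decide which pixel column of the camera image the label should sit in. The function names, coordinates, and the 60-degree field of view are assumptions made up for the example.)

```python
import math

# A minimal sketch (not Urbanspoon's actual code) of how a "scope"-style AR
# overlay can place nearby points of interest on a live camera image using
# only GPS coordinates and a compass heading.

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def bearing_and_distance(lat1, lon1, lat2, lon2):
    """Return (bearing in degrees from north, distance in meters) from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    # Great-circle initial bearing
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(x, y)) + 360) % 360
    # Haversine distance
    dphi = phi2 - phi1
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    distance = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    return bearing, distance

def overlay_position(user_heading_deg, poi_bearing_deg, screen_width_px, camera_fov_deg=60):
    """Map a point of interest to a horizontal pixel column, or None if it is off-screen."""
    # Signed angle between where the camera points and where the POI lies (-180..180)
    offset = (poi_bearing_deg - user_heading_deg + 180) % 360 - 180
    if abs(offset) > camera_fov_deg / 2:
        return None  # outside the camera's field of view
    return int((offset / camera_fov_deg + 0.5) * screen_width_px)

# Example: phone facing due north, restaurant a few blocks to the north-east.
bearing, dist = bearing_and_distance(47.6062, -122.3321, 47.6090, -122.3300)
x = overlay_position(user_heading_deg=0.0, poi_bearing_deg=bearing, screen_width_px=1080)
print(f"Draw label {dist:.0f} m away at x = {x}px")
```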

But early AR was cumbersome. Using it meant holding your smartphone out in front of you while the built-in camera ran, severely limiting your ability to multitask. Consequently, AR was used only occasionally and was seen more as a gimmick than a revolution.

In the mid-2010s an enterprising company marketed a Bluetooth peripheral visor for use with cellphones. The first system used a small front-facing camera that fed images to small screens inside the visor. This allowed for hands-free, always-on AR, also known as a heads-up display (or HUD). The first-generation visor sold well but was plagued by several issues. The battery life was terrible, lasting less than two hours of continuous use before the batteries needed to be replaced or recharged. Unfortunately, most people suffered severe eye strain and gave up well before they reached that mark. There were also complaints that the visor was heavy and hurt the ears and the bridge of the nose if worn too long.

The company came out with two more versions of the visor. The second-generation visor had two front-facing cameras, slightly offset, and left-eye/right-eye screens inside to provide depth perception and a sort of 3D viewing ability. This model was plagued by the same issues as the first generation: poor battery life, eye strain, and physical fatigue from use, as well as vertigo and nausea. The third-generation visor included several panoramic cameras and a screen that covered the entire inside of the visor, giving the wearer a greater range of peripheral vision, but by then public interest had waned following the two previous versions and sales plummeted.

Courtesy of Blutsbruder Team

Artist's Rendering of OLED Glasses

Had the panorama visor been a success, its market dominance would have been short-lived, because later that decade a tech company formed by MIT graduates hit the market with transparent OLED (organic light-emitting diode) eyewear. The first-generation OLED glasses were not completely transparent and functioned very much like sunglasses, even when powered off. Like the AR visors, the transparent OLED glasses used Bluetooth to connect to a smartphone, but the similarities virtually ended there. Since the OLED glasses were see-through, there was no longer a need for a front-facing camera, let alone one that ran constantly. Eliminating the camera made the glasses much lighter and dramatically reduced battery drain. Each lens was a separate OLED screen, but their translucence eased eye strain considerably.

Unfortunately, there were still drawbacks. The early lenses were large – about the size of old aviator sunglasses – and they were heavier than your average pair of specs. They were also virtually unusable for those who already wore corrective lenses (excluding contacts, of course). The company did market a “clip-on” version of its product, but it was cumbersome and made regular glasses very front-heavy. However, as OLED technology advanced, the lenses became thinner, lighter and – most importantly – more transparent, eventually culminating in ultra-thin transparent OLED screens roughly 1–2 mm thick. These could be fitted over regular corrective lenses, bringing OLED HUDs to everyone.

It was around this time that the government got heavily involved in funding and developing AR systems. The military had previously experimented with AR display technology, such as goggles that Marine mechanics could wear to assist in vehicle repair. But now it was looking for a more convenient, universal heads-up display receiver that could be used in the field. The ultra-thin OLED screens came in quite handy.

Typical Video Game HUD: Strangely Prophetic

The military was able to marry proprietary software with updated hardware to create the type of real-time HUD you might see in a turn-of-the-century video game. It was bundled with an earpiece and a microphone. Suddenly soldiers had access to vital information without having to take their eyes off the battlefield, and were in constant contact with their squad mates.

Aside from simple information like a soldier’s heading, elevation, latitude, longitude, and remaining ammunition, they could also pinpoint the exact location of their fellow soldiers with visual markers. This dramatically lowered the incidence of friendly fire during chaotic firefights. Soldiers could also “mark” targets with lasers, often provided by unmanned drones flying overhead. This allowed them to “see” a target’s position and distance even when the target wasn’t visible to the naked eye.
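(For the curious, here is a rough sketch of the geometry behind those marker icons. It is not the military’s software, just a standard pinhole-camera projection: given a target’s position relative to the wearer in meters east/north/up and the direction the wearer is looking, it computes where on the glasses’ display the icon should be drawn. The display resolution, field of view, and function names are all assumptions for illustration.)

```python
import numpy as np

# Illustrative world-to-HUD projection: place a marker for a target whose
# position relative to the wearer is known in (east, north, up) meters.

def look_rotation(yaw_deg, pitch_deg):
    """Rotation matrix taking world (east, north, up) into camera (right, down, forward)."""
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
    # Forward vector of the wearer's gaze; yaw is a compass heading (0 = north, 90 = east)
    fwd = np.array([np.sin(yaw) * np.cos(pitch), np.cos(yaw) * np.cos(pitch), np.sin(pitch)])
    right = np.array([np.cos(yaw), -np.sin(yaw), 0.0])
    down = np.cross(fwd, right)
    return np.vstack([right, down, fwd])

def project_to_hud(target_enu, yaw_deg, pitch_deg, width=1280, height=720, fov_deg=70):
    """Return (x, y) pixel position of a target, or None if it is behind the wearer."""
    cam = look_rotation(yaw_deg, pitch_deg) @ np.asarray(target_enu, dtype=float)
    r, d, forward = cam
    if forward <= 0:
        return None  # target is behind the camera, no marker to draw
    focal = (width / 2) / np.tan(np.radians(fov_deg) / 2)  # focal length in pixels
    x = width / 2 + focal * r / forward
    y = height / 2 + focal * d / forward
    return int(x), int(y)

# Example: ally 120 m east and 80 m north, wearer looking roughly north-east.
print(project_to_hud(target_enu=(120.0, 80.0, -2.0), yaw_deg=45.0, pitch_deg=0.0))
```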

In addition to these functions, soldiers carried other pieces of equipment, such as bullet-sensing microphones that could pinpoint the direction an attack was coming from and represent it visually on the HUD. They also carried mini-sonar devices that could “map” the nearby terrain. For instance, if a team of soldiers was sent to clear a building, their sonar would create a real-time map of the inside by constantly bouncing signals off the walls, ceilings, floors, furniture, and occupants. Some soldiers also carried “sonar grenades”: small sonar devices that could be tossed into a room before the soldiers entered, for instance to see the number and location of anyone inside. Soldiers outside the building could then access this map if they were called in as backup.
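(The room maps those devices built can be pictured as a textbook occupancy grid. The toy sketch below is generic robotics-style mapping rather than any actual military system: each range reading marks the cells the ping passed through as clear and the cell where the echo returned as a wall. The grid size, cell size, and the fixed 4-meter range are made up for illustration.)

```python
import math

# Toy occupancy-grid illustration of a "sonar grenade" mapping a room:
# "?" = unknown, "." = heard to be clear, "#" = surface that returned an echo.

GRID_SIZE = 20          # 20 x 20 cells
CELL_M = 0.5            # each cell covers half a meter

def update_grid(grid, origin, angle_deg, range_m):
    """Trace one sonar return: free space along the beam, occupied at the echo."""
    ox, oy = origin
    steps = int(range_m / CELL_M)
    for i in range(steps + 1):
        x = int(ox + math.cos(math.radians(angle_deg)) * i)
        y = int(oy + math.sin(math.radians(angle_deg)) * i)
        if not (0 <= x < GRID_SIZE and 0 <= y < GRID_SIZE):
            return
        grid[y][x] = "#" if i == steps else "."

grid = [["?"] * GRID_SIZE for _ in range(GRID_SIZE)]
device = (10, 10)                      # grenade lands mid-room
for angle in range(0, 360, 10):        # one ping every 10 degrees
    simulated_range = 4.0              # pretend every wall is 4 m away
    update_grid(grid, device, angle, simulated_range)

print("\n".join("".join(row) for row in grid))
```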

The military HUD glasses also allowed anyone in the squad (with the appropriately encrypted codes, of course) to tap into a fellow soldier’s video feed. You could see through your ally’s eyes in case they had a vantage point on some hidden danger you could not see. The glasses also let soldiers see through cameras mounted on their weapons, making urban warfare much safer. A soldier could get a view around a corner or through a doorway while exposing only the firearm and hands to danger (as opposed to the head and face), and could shoot around the corner without exposing the entire body. They also gained night vision without having to wear the old, bulky apparatus on their faces; night-vision cameras could now be mounted to weapons and helmets for a far less cumbersome setup.

Today we have improved on the transparent OLED screen. We use a very thin, fiber-optic-like film that can be applied to virtually any surface. Light is projected into the film from the edges and produces a visible image. The film can be applied to walls, appliances, windows, eyeglasses, just about any reasonably flat surface. We still get our Augmented Reality information from a separate device, in most cases our mobile computing devices.

You can customize your HUD with any number of your favorite apps, so everyone’s augmented experience is slightly different. But if you’re not willing to pay a premium for each app, be prepared to deal with ads. Pop-up AR ads were banned about five years ago (thank God), but you still have to deal with advertisements virtually projected on almost every flat surface you can see. They’re easy enough to ignore, and you can usually download scripts to disable a lot of them if they’re really a nuisance.

Where can Augmented Reality go in the future? Well, I would have to say that someday, instead of AR devices being peripherals that you wear, they’re going to be implanted in your body. Tiny aural implants in your ears that vibrate the bones, allowing you — and only you — to hear your music or your conversation. Perhaps we’ll have micron-thin lenses implanted on the cornea that let you see Augmented Reality images without goggles. Or perhaps nano-projectors implanted in the eyes themselves, projecting images directly onto the retina so that you can see even when your eyes are closed. It may sound like science fiction, but it’s closer to reality than you know.

*The “Artist’s Rendering” was used without permission (but with good faith) from Blutsbruder Design. Check out their awesome site here. The goggles were not actually intended to represent transparent OLED devices.
