01_EE_vision2020_siteDesign.jpg
03_sectionTitle.jpg
04_sectionTitle.jpg
05_sectionTitle.jpg
06_sectionTitle.jpg
07_sectionTitle.jpg
08_sectionTitle.jpg
09_sectionTitle.jpg
01_EE_vision2020_siteDesign.jpg

Title - Intro


SCROLL DOWN

Title - Intro


 

WHAT WILL THE DAY AFTER TOMORROW LOOK LIKE?

The Razorfish Emerging Experiences team comprises dreamers, artists and makers, and we're constantly challenging each other to craft the most compelling, seamless, and meaningful brand experiences for our clients and partners. Our most recent challenge looks to the future—the year 2020, to be exact. We're not talking about flying cars (which we know will come someday), but groundbreaking, practical technologies that we'll start to play with in the very near future.

 
 
 
 

We firmly believe the best of tomorrow is yet to come and this is a glimpse inside our looking glass.

 
03_sectionTitle.jpg

Section 1


SCROLL DOWN

Section 1


 
 
 

Augmented Reality (AR) technologies have been around in one form or another for years, but only recently have computers and sensors become small enough, fast enough, and accurate enough to make mass-produced AR headsets feasible. The first generation of hardware will certainly have its limitations, as all technologies do, but when AR is unleashed into the world it will nevertheless find applications and scenarios where it flourishes.

To see this, consider how we work with computers now. When you boot your computer, you are taken to your "desktop"; you organize your "files" into "folders"; when you surf the web, you see "pages". I put these terms in quotes because they are all metaphors we borrowed from the real world to describe abstract concepts in computing. You don't really push buttons on a touch screen—you tap a region of the screen, and the system responds by updating the display to show a pushed button and finally a button at rest. Even 30 years into the personal computer revolution, and even after the first digital generation has grown up, we still rely on the physical world to understand how to work with computers. The benefit of AR is that it no longer requires us to use these often-contorted metaphors to interact with a computing platform. Instead, AR celebrates the physical world we understand so well by simply adding to what we already see, and it will thrive in solving problems grounded in the real world.

Consider the problem in Flint, Michigan, where lead is now contaminating much of the city's drinking water. This contamination is due to corrosion in CNL (copper and lead) pipes. The most direct way to fix the problem is to replace the corroded pipes, but even a small municipality probably has hundreds of miles of service pipes, segments of which were laid over the entire course of the municipality's existence using pipes made of different materials. Which segments do you replace? Where do you start digging? Imagine if you could put on an AR headset and just see the pipes underground, right through the soil that buries them, like x-ray vision. Imagine if you could see the individual segments that were corroded. The amount of time AR could save in such an endeavor could literally change lives for the better. To be fair, AR alone wouldn't let you see through soil to the pipes below (very accurate map data would also be needed), but it does illustrate how AR can enable near-magical powers in the real world.

Augmented reality would be no less magical in marketing. Consider a personalized home builder that builds standard spaces but lets the buyer specify what types of cabinets and countertops to use in the kitchen, what type of floors to use in the den, what color paint to use on the walls, and what types of tile to use in the bathroom. An AR experience that allowed prospective buyers to see the property, not as it is now, but as they want it when they move in, could have an enormous positive emotional impact on the buyer. This is an example of AR making a tremendous difference in what is likely the largest purchase of someone's life—and AR could be just as powerful on the other side of the buying spectrum. Consider a liquor manufacturer that wants to elevate its brand from simply "maker of premium booze" to a lifestyle brand. AR could be used to build an experience that helps a party host mix drinks for guests by calculating the volume of a serving glass and then superimposing pour levels for the various ingredients. More involved drinks would of course require more detailed instruction, but the point is that the liquor manufacturer starts to solidify itself as a lifestyle brand by enabling people to live the lifestyle they want to embody.
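To make the pour-level idea concrete, here is a minimal sketch (in Python, with a hypothetical recipe) of the arithmetic such an experience might run: for a roughly cylindrical glass whose dimensions the headset has measured, each ingredient's share of the recipe becomes a fill line the display can superimpose.

    import math

    # Hypothetical recipe: each ingredient's share of the total drink volume.
    MARGARITA = {"tequila": 0.50, "triple sec": 0.25, "lime juice": 0.25}

    def pour_guides(diameter_cm, height_cm, recipe, fill_fraction=0.9):
        """For a cylindrical glass, turn a recipe's volume ratios into the
        cumulative fill heights (cm) an AR overlay would draw, plus each
        ingredient's volume in ml (1 cm^3 == 1 ml)."""
        area = math.pi * (diameter_cm / 2) ** 2      # glass cross-section
        usable_height = height_cm * fill_fraction    # leave headroom at the top
        level, guides = 0.0, []
        for ingredient, share in recipe.items():
            pour_height = share * usable_height      # cylinder: height ~ volume
            level += pour_height
            guides.append((ingredient, round(level, 2), round(area * pour_height, 1)))
        return guides

    # An 8 cm wide, 10 cm tall glass, as measured by the headset's depth sensors.
    for name, line_cm, ml in pour_guides(8.0, 10.0, MARGARITA):
        print(f"{name}: fill to the {line_cm} cm line ({ml} ml)")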

These applications are really just the tip of the spear. Augmented reality enables an entirely new class of computing platforms. In just a few years you will see AR-configured, IoT-based personal area networks; AR-based, multidimensional data-analysis systems; and, through integration with object recognition via advances in computer vision, open computational environments.

 
 
04_sectionTitle.jpg

Section 2


SCROLL DOWN

Section 2


 
 
 

Connected Platforms & IoT Enable Incredible "If This, Then That" Moments

As networks and technologies advance, real-time experiences leveraging big data will continue to connect drivers with their surroundings. For example, Qualcomm's Snapdragon Automotive Solutions offer a range of connectivity options: Vehicle-to-Vehicle (V2V), Vehicle-to-Infrastructure (V2I), Vehicle-to-Pedestrian (V2P) and, of course, Vehicle-to-Cloud (V2C) communications. From BLE (Bluetooth Low Energy) to NFC, RFID and more, the integration of connective technologies across devices, objects and environments enriches the connected driving experience. As a result, expect to see huge improvements in areas such as personalization and even safety, in ways that could make the all-too-familiar auto accident a thing of the past.
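To illustrate the "if this, then that" pattern in code, here is a purely illustrative sketch (hypothetical event names, not any real V2X API) of how a connected car might map incoming messages to actions:

    from dataclasses import dataclass

    @dataclass
    class V2XEvent:
        source: str        # "V2V", "V2I", "V2P", or "V2C"
        kind: str          # hypothetical event type
        distance_m: float  # distance to the event source

    # "If this, then that": each rule pairs a condition with an action.
    RULES = [
        (lambda e: e.source == "V2V" and e.kind == "hard_braking" and e.distance_m < 150,
         "alert driver and pre-charge brakes"),
        (lambda e: e.source == "V2P" and e.kind == "pedestrian_crossing" and e.distance_m < 50,
         "chime and highlight the crosswalk on the HUD"),
        (lambda e: e.source == "V2I" and e.kind == "signal_timing",
         "adjust speed to catch the green light"),
    ]

    def handle(event):
        """Return every action whose condition matches the incoming event."""
        return [action for condition, action in RULES if condition(event)]

    print(handle(V2XEvent("V2V", "hard_braking", 120.0)))
    # ['alert driver and pre-charge brakes']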

When vehicle data moves from primary to secondary use, its value is extended over time. Cars are laden with chips, sensors and software that upload performance data to computers when the vehicle is serviced. Secondary uses of this data will give competitive advantages to players in adjacent sectors looking to broaden their own value propositions. And the best practices and lessons learned from how auto marketers leverage consumption data will be instructive for all marketers.

 

 
05_sectionTitle.jpg

Section 3


SCROLL DOWN

Section 3


 
 
 

AR – Augmented Retail

In 2020, Magic Leap releases its second generation of AR glasses, glasses that finally find consumer acceptance with their attractive styling and practically invisible technology. A little bulkier than a standard set of frames, these glasses deliver on the promise of practically 24/7 AR. While the first generation was a success with gamers and businesses, the original goggle-like format was too distracting for everyday users. But this is a new day: amazing jumps in battery, computing and wireless technology now allow us to experience a fully mixed reality. This experience is extremely useful in retail, allowing customers to identify products they are interested in at a distance, try on clothing without removing any clothes, and analyze salespeople's honesty with emotion recognition: "Oh really? Does this really look fantastic on me?" People making more considered purchases can see the invisible as they gather information about the object of their desire. How much is this jewelry worth? Is it really 24 karat? Who did the cut? Those lucky enough to own the new glasses will flock to retail to get the best of both the online and physical retail experiences in one.

 

Headline from 2020:
Amazon delivers its millionth package via drone in just 12 days

In its first official rollout, Amazon deployed 2,000 drones across the 500 suburban neighborhoods that agreed to allow the service in 2019, and it is working toward its goal of moving 25% of its delivery service to autonomous drones by 2025. Amazon Now Air has a perfect safety record despite all of the initial concerns; that safety record comes at the expense of reliability and delivery times, though, with many weather groundings and longer flight paths to avoid traveling over people's homes and populated areas. The service is constrained to small items under 5 lbs, but is very popular with people who understand its limitations. Many users see delivery times of under an hour for popular items. Amazon also announced that it has cut its greenhouse gas output by 5% to date thanks to the all-electric UAV fleet. Each drone currently makes about 12 trips a day, serviced from a local warehouse or a storage-container-based distribution point outfitted with a technician and a battalion of battery-charging equipment.

 
 

Dimensional UI

For the first couple of years of the AR/VR revolution, UIs were a horrific mess, from unnavigable, incredibly complex Iron Man-style interfaces to overly simplistic 2D menus that reminded us of in-car interfaces from the early 2000s, with dozens of clicks needed to change the car's clock. But now, in 2020, 3D UI has found its sweet spot: a gradual evolution toward the right blend of beauty and usability, as a new generation of experience designers and artists put their ideas in front of first thousands and now millions of users. The technology evolved too, as the field of view of AR and VR devices grew from postage stamps to full-eye vistas. Sophisticated computer vision algorithms and eye tracking have elevated natural, gesture-based communication to the point that many people prefer pointing at things to control them over saying commands out loud to their voice-based AR assistants.

In 2020, Microsoft's Holo-Office software is released, a natural next step in Microsoft's return as a digital business powerhouse. With Holo-Office, users can work together remotely and in person with equal experiences. Wearing a version-2 HoloLens in VR mode, remote workers can share the office space with their in-person co-workers; to those wearing a HoloLens in AR mode in the physical office, remote workers appear at their virtual desks. Workers can collaborate on documents anywhere and everywhere with a virtual screen that all participants can see.

 

Headline from 2020:
The hospital’s new eyes

For years the medical community has used this graphic to remind doctors and nurses to be sensitive to their patients' discomfort. But IBM has released a new system that uses Watson to identify patient pain, heart rate and skin temperature from video, using traditional and FLIR cameras. Leveraging state-of-the-art emotion recognition algorithms, Watson is able to identify patients struggling with pain in the context of their condition and medications, and to take corrective measures, either by alerting the nursing staff or by dispensing medication as required. The system has proven to be 300% more accurate than human screeners, and in one case Watson spotted a pattern of pain that revealed an unknown issue and saved the patient's life. The majority of patients are aware of the system and happy that it is watching over them, better than a limited nursing staff could do on its own.

06_sectionTitle.jpg

Section 4


SCROLL DOWN

Section 4


 
 
 

Where will Computer Vision be in 2020?

What is computer vision? It is the ability of a computer to use a camera and process incoming images to discern something from them. We see it today in home security cameras that can watch designated zones, facial recognition that unlocks devices, cameras that find a face and keep it in focus, cars that use cameras to understand obstacles on the road, and social media that recognizes our friends. Where will that leave us in 2020?

By 2020 our sensors for capturing the visual world will improve, and we will be able to capture a better visual representation of the world. There will be a budding area of light field video capture, and computer vision algorithms will be applied to it to better make sense of what the sensors are seeing. Light field capture differs from traditional image capture in that it captures light coming in from multiple directions at every capture point. As sensors capture more visual data about the world, we will continue to push the envelope of what single systems, especially mobile systems, can process on their own. And even as sensors get better at capturing our visual world, computer vision is limited to a visual representation of the world, and there are many other ways of sensing it. To overcome the limits of sensors and processors, we will see greater use of combined sensors as well as systems that rely on external systems for processing.

Traditionally, combining multiple sensors on a single system is known as sensor fusion. An example of sensor fusion with computer vision today is combining camera, accelerometer, gyroscope, and compass data to determine depth from an image. By 2020 I anticipate sensor fusion that extends beyond single systems. For example, multiple devices will share their sensor data with a cloud-based system that can apply machine learning to make better sense of the surroundings. A single description or simulation can be computed by the cloud-based system and then shared with multiple systems. Today we collect traffic and road-condition data from multiple mobile devices and share traffic conditions between them. By 2020 we should be able to share visual, depth, acoustic, and other sensor data processed by a cloud-based system. Making sense of the increased sensor data from multiple systems will require machine learning that can understand it and feed results back to those systems. In 2020, as the number of connected and sensor-equipped systems grows, computer vision will be just one of the fundamental tools cloud-based systems use to build a shared understanding of the world.
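As a minimal sketch of that cloud-side fusion step (illustrative only; real systems would use full Kalman filters or learned models), suppose several devices each report a noisy estimate of the same quantity, say the distance to a landmark. Inverse-variance weighting, the optimal linear combination for independent Gaussian measurements, fuses them into a single shared estimate:

    def fuse_estimates(estimates):
        """Fuse independent noisy estimates of one quantity.

        estimates: list of (value, variance) pairs, one per reporting device.
        Returns (fused_value, fused_variance) via inverse-variance weighting.
        """
        weights = [1.0 / var for _, var in estimates]
        total = sum(weights)
        value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
        return value, 1.0 / total

    # Three devices estimate the distance (m) to the same landmark; the
    # camera-only device (first) is noisier than the two that also fused IMU data.
    value, variance = fuse_estimates([(10.4, 0.8), (9.9, 0.2), (10.1, 0.3)])
    print(f"shared estimate: {value:.2f} m (variance {variance:.3f})")

Note that the fused variance is smaller than any single device's: each additional sensor sharpens the shared picture.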

 

 
07_sectionTitle.jpg

Section 5


SCROLL DOWN

Section 5


 
 
 

Radical Personalization in Real Time

Gartner predicts the number of internet-connected devices will reach 25 billion by the year 2020. Everything from cars to toaster ovens is being outfitted with microcomputers equipped with sensors and wireless radios for data collection. These devices will produce a massive amount of data about where we eat (or did we order delivery from our smart fridge?), where we go, how we got there (Uber or a connected car), and who we interacted with. All the while, biometric sensors in watches and fitness bands give a pretty good idea of how we are feeling (excited, bored, sleepy) at any given moment. In its current state, most of this data is merely collected, to be analyzed later. By 2020, we'll see all this data aggregated and analyzed in real time. Machine learning techniques will be used to radically personalize experiences in all aspects of our lives. This real-time information has the potential to disrupt almost every industry, as people come to expect services to be pushed rather than pulled. You might get a push notification saying it is time to order an Uber to get to an appointment on time (or maybe it orders one automatically and alerts you when the car has arrived). Your fridge might recommend a high-protein snack to help you meet your custom nutrition macros after a particularly tough workout. Seamless integration will be the norm, and products and services that don't adapt will fall out of favor and be forgotten.
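To ground the Uber example, here is a minimal sketch of the timing logic such a proactive service might run (hypothetical thresholds; a real service would fold in live traffic and pickup-wait estimates):

    from datetime import datetime, timedelta

    def should_prompt_ride(appointment, travel_estimate, now,
                           pickup_wait=timedelta(minutes=5),
                           buffer=timedelta(minutes=10)):
        """Push a 'time to order your ride' notification once the time
        remaining is down to travel time + pickup wait + a safety buffer."""
        leave_by = appointment - travel_estimate - pickup_wait - buffer
        return now >= leave_by

    now = datetime(2020, 3, 2, 8, 20)
    appointment = datetime(2020, 3, 2, 9, 0)
    if should_prompt_ride(appointment, timedelta(minutes=30), now):
        print("Push: order your ride now to make your 9:00 appointment.")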

 
08_sectionTitle.jpg

Section 6


SCROLL DOWN

Section 6


 
 
 

Medical

As our society becomes increasingly aware of the need to support therapy, mental health and wellness, I see great opportunities for advancements in design and technology within medical fields.

VR Therapy

Experiences for exposure therapy, which provide a controlled environment where people can face their fears (like fear of flying) or practice coping strategies, already exist and are seeing success. I anticipate expansions on current AR therapies, such as those for Phantom Limb Pain (PLP), toward helping people adjust to extreme changes in their physical condition. The use of AR/VR as therapy will also improve the quality of life of people with limited mobility, giving them access to broader experiences when they'd otherwise be confined. In the near future we could expect these uses of AR/VR to become as accessible as a local yoga studio or wellness room.

Rehabilitation with Robotics

The demand for rehabilitation services is growing as the population of senior citizens increases. According to the World Health Organization, the number of people at least 65 years of age will increase by 88% in the coming years. There is both a need and an opportunity to deploy technologies such as robotics to assist recovery from injuries or after a stroke. Research has found that by actively engaging stroke patients in repetitive tasks, the brain is able to rewire neurological pathways to motor functions and relearn movement. Functional recovery can continue years after the brain injury. By integrating these therapies with a person's everyday life, they become more accessible and easier to continue.

Pain & Drug Management

Since 2002, what appear to be garden-variety medication carts have traveled the halls of the University of Maryland Medical Center in Baltimore, delivering medications to three critical-care units. The self-powered carts can move along the halls alone; they can even call and board elevators all by themselves. When they arrive at their destinations, they announce themselves and then allow only authorized personnel access to the medications they are delivering.

This robotic technology could be incredibly useful as part of a futuristic at-home medicine cabinet. It could help manage supplies of standard medications such as allergy pills or over-the-counter cold medicines. For prescription medications, household members could receive notifications to take their medication from the unit or from a portable companion dispensing device. The unit could track whether you've already taken a dose, or let you easily contact your medical care team.
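As a minimal sketch of the dose-tracking logic such a cabinet might run (hypothetical schedule data; a real device would sync with a pharmacy or care team):

    from datetime import datetime, timedelta

    # Hypothetical schedule: medication -> minimum hours between doses.
    SCHEDULE = {"allergy pills": 24, "cold medicine": 6}

    # When the unit last dispensed each medication to this household member.
    last_dose = {"allergy pills": datetime(2020, 3, 2, 8, 0),
                 "cold medicine": datetime(2020, 3, 2, 13, 0)}

    def due_medications(now):
        """Return medications whose dosing interval has elapsed, so the
        cabinet knows what to notify about (and what to refuse)."""
        return [med for med, hours in SCHEDULE.items()
                if now - last_dose[med] >= timedelta(hours=hours)]

    for med in due_medications(datetime(2020, 3, 2, 20, 0)):
        print(f"Reminder: time to take your {med}.")
    # Only 'cold medicine' is due; a second allergy pill would be refused.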

 
09_sectionTitle.jpg

Section 7


SCROLL DOWN

Section 7


 
 
 

Technology That Listens, Infers and Builds Context

80 billion devices. In a world not too far away, we will have more connected devices than we have people on Earth. But what does this really mean? Sure, there will be an economic impact: buyers, sellers, an aftermarket, and the infrastructure needed to support all these devices. But still, what does this really mean? Our lives will connect to technology in a way that has never been imagined. We will directly interact with this new technology, but the real intrigue comes from the technology that we are not interacting with. The true game changer is the technology that listens, infers, is aware, and builds context. In 2012, Google released Google Now, putting an always-listening Google in the pockets of Android users. In 2015, Amazon released the Echo, a speaker with voice activation and integration with your smart devices. This was only a taste of what the future holds. Let's take a short trip through the eyes of Ted to see where this technology is headed.

Ted wakes up to the gradual fade-in of his favorite oldies group. As Ted tosses in bed, the lights slowly turn on, giving his eyes ample time to adjust. He finally gets out of bed and heads to the bathroom, where his smart mirror offers him a cup of coffee before his shower. Ted heads to the kitchen just as his cup of coffee finishes pouring. After enjoying his coffee, he grabs his towel just as the shower turns on at his preferred temperature. On the way to work, Ted's car offers him three lunch options:

The Regular – your local sandwich shop, where you can place your usual order at the press of a button, pay automatically, and walk in to pick it up just as they finish preparing it.

The Rusher – the sandwich of the day, delivered directly to your office.

The Offer – A new lunch spot has opened near your office and presents a sample menu based on your preferences.

Ted selects The Rusher and focuses on his commute. Over lunch, while enjoying his Vietnamese sandwich, Ted chats with his co-workers about wanting to take a vacation.

On his way home, his car asks whether he enjoyed his lunch, which he did.

While Ted is washing up for bed, the edges of his mirror are slowly taken over by lush green trees, and an image of a beach fades in behind Ted's reflection with the message: "How about a trip to Southeast Asia?" Ted swipes up to create a travel watch alert for the vacation.

Ted's story is a vision of where we will be in 2020. Our lives will be filled with technology that infers context from sensors attuned to location, audio, visuals, touch, and many other inputs. These enchanted objects will generate petabytes of data, far more than any single person could decipher.

Luckily, we will have artificial intelligence engines to listen to our conversations and perform the heavy lifting of building context from all the available inputs. As these engines grow more powerful, we will see them used for social encouragement, such as helping Ted meet his life goals, and in subtler ways, such as shaping Ted's lunch recommendations. These AI systems will be fully integrated with larger systems: when the mirror offered Ted a cup of coffee, the system was actually delaying his shower so the building's hot-water boiler could refill.

No matter how robust the system is, Ted is still the center point. Ted is engaged to train and refine the recommendation engine, whether by swiping up to plan a vacation or by picking a lunch. Choice is the most important factor for the future. We will need tools that present simple choices based on rich, easily explorable datasets. This practice of data visualization will be critical in developing and maintaining systems that enrich our daily lives.

 

 
 
 
 
 

Emerging Experiences, part of Razorfish’s Experience Innovation service offering, is a global practice of makers, creators, technologists and dreamers that helps clients blend digital and physical worlds through customer obsession. We envision, incubate and launch transformational experiences—which are often crafted, prototyped and tested within one of our Emerging Experiences Labs—in order to innovate tomorrow, today.