Engineering Archives - Investment Capital Growth

The Investment Capital Growth Blog

Welcome To The ICG Blog

Strategic Insights For Business Leaders & Their Teams

Investment Capital Growth is dedicated to the personal and professional development of C-Level Executives and the management teams that run modern business. Our blog shares insights and strategies culled from years of entrepreneurial and executive experience. Our thought leaders regularly publish business articles to inspire and empower.

Get Inspired, Stay Connected:

  • Subscribe To Our Blog For Updates
  • Follow ICG on Social Media
  • Engage Our Consultants

Subscribe To The ICG Blog

ICG Newsletter Signup

ICG's Monthly Newsletter delivers insightful and actionable information for business owners and their teams. Get the latest updates from the ICG team each month including exclusive case studies, expert commentary, special offers and real life examples of business success. Join the thousands of subscribers that enjoy our informative publication by entering your contact information below.

Contact us.

How Augmented Reality (AR) will change your industry

Posted by Cliff Locks On October 16, 2019 at 10:06 am / In: Uncategorized

How Augmented Reality (AR) will change your industry

Augmented Reality (AR) has already surpassed 2,000 apps running on over 1.4 billion active iOS devices. Even if only at a rudimentary level, the technology is now permeating the consumer products space.

And in just the next four years, the International Data Corporation (IDC) forecasts AR headset production will surge 141 percent each year, reaching a whopping 32 million units by 2023.

AR will soon serve as a surgeon’s assistant, a sales agent, and an educator, personalized to your kids’ learning patterns and interests.

In this fourth installment of our five-part AR series, I’m doing a deep dive into AR’s most exciting industry applications, poised to hit the market in the next 5-10 years.

Let’s dive in.

Healthcare 

(1) Surgeons and physicians: 

Whether through detailed and dynamic anatomical annotations or visualized patient-specific guidance, AR will soon augment every human medical practitioner.

To start, AR is already being used as a diagnosis tool. SyncThink, recently hired by Magic Leap, has developed eye-tracking technology to diagnose concussions and balance disorders. Yet another startup, XRHealth, launched its ARHealth platform on Magic Leap to aid in rehabilitation, pain distraction, and psychological assessment.

Moreover, surgeons at Imperial College London have used Microsoft’s HoloLens 1 in pre-operative reconstructive and plastic surgery procedures, which typically involve using CT scans to map blood vessels that supply vital nutrients during surgery.

As explained by the project’s senior researcher, Dr. Philip Pratt, “With the HoloLens, we’re now doing the same kind of [scan] and then processing the data captured to make it suitable to look at. That means we end up with a silhouette of a limb, the location of the injury, and the course of the vessels through the area, as opposed to this grayscale image of a scan and a bit more guesswork.”

Dramatically lowering associated risks, AR can even help surgeons visualize the depth of vessels and choose the optimal incision location.

And while the HoloLens 1 was only used in pre-op visualizations, Microsoft’s HoloLens 2 is on track to reach the operating table. Take Philips’ Azurion image-guided therapy platform, for instance. Built specifically for the HoloLens 2, Azurion strives to provide surgeons with real-time patient data and dynamic 3D imagery as they operate.

Moreover, AR headsets and the virtual overlays they provide will exponentially improve sharing of expertise across hospitals and medical practices. Niche medical specialists will be able to direct surgeons remotely from across the country (not to mention the other side of the planet), or even view annotated AR scans to offer their advice.

Magic Leap, in its own right, is now collaborating with German medical company Brainlab to create a 3D spatial viewer that would allow clinicians to work together in surgical procedures across disciplines.

But beyond democratizing medical expertise, AR will even provide instantaneous patient histories, gearing doctors with AI-processed information for more accurate diagnoses in a fraction of the time.

By saving physicians’ time, AR will therefore free doctors to spend a greater percentage of their day engaging in face-to-face contact with their patients, establishing trust, compassion, and an opportunity to educate healthcare consumers (rather than merely treating them).

And when it comes to digital records, doctors can simply use voice control to transcribe entire interactions and patient visits, multiplying what can be done in a day, and vastly improving the patient experience.

(2) Assistance for those with disabilities: 

Today, over 3.4 million visually impaired individuals reside in the U.S. alone. But thanks to new developments in AI-integrated smart glasses, the constraints they face could soon fade in severity.

And new pioneers continue to enter the market, including NavCog, Horus, AIServe, and MyEye, among others. Microsoft has even begun development of a “Seeing AI” app, which translates the world into audio descriptions for the blind, as seen through a smartphone’s camera lens.

During the Reality Virtual Hackathon in January, hosted by Magic Leap at MIT, two of the top three winning projects addressed accessibility for people with disabilities. CleARsite provided environment reconstruction, haptic feedback, and Soundfield Audio overlay to enhance a visually impaired individual’s interaction with the world. Meanwhile, HeAR used a Magic Leap 1 headset to translate speech or sign language into readable text in speech bubbles in the user’s field of view. Magic Leap remains dedicated to numerous such applications, each slated to vastly improve quality of life.

(3) Biometric displays:

In biometrics, cyclist sunglasses and swimmer goggles have evolved into the perfect medium for AR health metric displays. Smart glasses like the Solos ($499) and Everysight Raptors ($599) provide cyclists with data on speed, power, and heart rate, along with navigation instructions. Meanwhile, Form goggles ($199)—just released at the end of August—show swimmers their pace, calories burned, distance, and stroke count in real-time, up to 32 feet underwater.
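
To make the arithmetic behind such displays concrete, here is a minimal Python sketch of how pace, stroke rate, and distance per stroke could be derived from raw lap data. The function and figures are hypothetical illustrations, not any vendor’s actual firmware.

```python
# Illustrative only: a hypothetical calculation of the numbers a smart goggle
# HUD might display, not Form's actual firmware.

def swim_metrics(distance_m, elapsed_s, strokes):
    """Derive pace, stroke rate, and distance per stroke from raw lap data."""
    pace_per_100m_s = elapsed_s / distance_m * 100.0     # seconds per 100 m
    stroke_rate_spm = strokes / (elapsed_s / 60.0)       # strokes per minute
    distance_per_stroke_m = distance_m / strokes
    return pace_per_100m_s, stroke_rate_spm, distance_per_stroke_m

pace, rate, dps = swim_metrics(distance_m=50, elapsed_s=45, strokes=38)
print(f"{pace:.0f} s/100 m, {rate:.0f} strokes/min, {dps:.2f} m/stroke")
```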

Accessible health data will shift off our wrists and into our fields of view, offering us personalized health recommendations and pushing our training limits alike.

Retail & Advertising

(1) Virtual shopping:

The year is 2030. Walk into any (now AI-driven, sensor-laden, and IoT-retrofitted) store, and every mannequin will be wearing a digital design customized to your preferences. Forget digging through racks of garments or hunting down your size. Cross-referencing your purchase history, gaze patterns, and current closet inventory, AIs will display tailor-made items most suitable for your wardrobe, adjusted to your individual measurements.

Available on most Android smartphones, Google Lens is already leaping into this marketplace, allowing users to scan QR codes and objects through their smartphone cameras. Google Lens’s Style Match feature even gives consumers the ability to identify pieces of clothing or furniture and view similar designs available online through e-commerce platforms.

(2) Advertising:

And these mobile AR features are quickly encroaching upon ads as well.

In July, the New York Times debuted an AR ad for Netflix’s “Stranger Things,” for instance, guiding smartphone users to scan the page with their Google Lens app and experience the show’s fictional Starcourt Mall come to life.

Source: App Developer Magazine.

But immersive AR advertisements of the future won’t all be unsolicited and obtrusive. Many will likely prove helpful.

As you walk down a grocery store aisle, discounts and special deals on your favorite items might populate your AR smart glasses. Or if you find yourself admiring an expensive pair of pants, your headset might suggest similar items at a lower cost, or cheaper distributors with the same product. Passing a stadium on the way to work, next weekend’s best concert ticket deals might filter through your AR suggestions—whether your personal AI intends them for your friend’s upcoming birthday or your own enjoyment.

Instead of bombarding you at every turn from a handheld device you have to carry, ads will appear only when most relevant to your physical surroundings. Or toggle them off entirely, and have your personal AI do the product research for you.

Education & Travel

(1) Customized, continuous learning:

The convergence of today’s AI revolution with AR advancements gives us the ability to create individually customized learning environments.

Throw sensors in the mix for tracking of neural and physiological data, and students will soon be empowered to cultivate a growth mindset, and even work towards achieving a flow state (which research shows can vastly amplify learning).

Within the classroom, Magic Leap One’s Lumin operating system allows multiple wearers to share in a digital experience, such as a dissection or historical map. And from a collaborative creation standpoint, students can use Magic Leap’s CAD application to join forces on 3D designs.

If successful, AR’s convergence with biometric sensors and AI will give rise to an extraordinarily different education system: one composed of delocalized, individually customizable, responsive, and accelerated learning environments.

Continuous and learn-everywhere education will no longer be confined to the classroom. Already, numerous AR mobile apps can identify objects in a user’s visual field, instantaneously presenting relevant information. As user interface hardware undergoes a dramatic shift in the next decade, these software capabilities will only explode in development and use.

Gazing out your window at a cloud will unlock interactive information about the water cycle and climate science. Walking past an old building, you might effortlessly learn about its history dating back to the sixteenth century. I often discuss information abundance, but it is data’s accessibility that will soon drive knowledge abundance. 

(2) Training:

AR will enable on-the-job training at far lower costs in almost any environment, from factories to hospitals.

Smart glasses are already beginning to guide manufacturing plant employees as they learn how to assemble new equipment. Retailers stand to dramatically cut the time it takes to train a new employee with AR tours and product descriptions.

And already, automotive technicians can better understand the internal components of a vehicle without dismantling it. Jaguar Land Rover, for instance, has recently implemented Bosch’s Re’flekt One AR solution. By giving technicians “x-ray” vision, the AR service allows them to visualize the insides of Range Rover Sport vehicles without removing their dashboards.

In healthcare, medical students will be able to practice surgeries on artificial cadavers with hyper-realistic AR displays. Not only will this allow them to rapidly iterate on their surgical skills, but AR will dramatically lower the cost and constraints of standard medical degrees and specializations.

Meanwhile, sports training in simulators will vastly improve with advanced AR headset technology. Even practicing chess or piano will be achievable with any tabletop surface, allowing us to hone real skills with virtual interfaces.

(3) Travel:

As with most tasks, AI’s convergence with AR glasses will allow us to outsource all the most difficult (and least enjoyable) decisions associated with travel, whether finding the best restaurants or well-suited local experiences.

But perhaps one of AR’s more sophisticated uses (already rolling out today) involves translation. Whether you need to decode a menu or access subtitles while conversing across a language barrier, instantaneous translation is about to improve exponentially with the rise of AI-powered AR glasses. Even today, Google Translate can already convert menu text and street signs in real time through your smartphone.

Manufacturing

As I explored last week, manufacturing presents the nearest-term frontier for AR’s commercial use. As a result, many of today’s leading headset companies—including Magic Leap, Vuzix, and Microsoft—are seeking out initial adopters and enterprise applications in the manufacturing realm.

Source: Arm Blueprint.

(1) Design:

Targeting the technology for simulation purposes, Airbus launched an AR model of the MRH-90 Taipan aircraft just last year, allowing designers and engineers to view various components, potential upgrades, and electro-optical sensors before execution. Saving big on parts and overhead costs, Airbus thereby gave technicians the opportunity to make important design changes without losing hands-on interaction with the aircraft.

(2) Supply chain optimization: 

AR guidance linked to a centralized AI will also mitigate supply chain inefficiencies. Coordinating moving parts, eliminating the need to hold a scanner at each checkpoint, and directing traffic within warehouses will vastly improve workflow.

After initially implementing AR “vision picking” in 2015, leading supply company DHL recently announced it would continue to use the newest Google smart lens in warehouses across the world. Or take automotive supplier ZF, which has now rolled out use of the HoloLens in plant maintenance.

Source: Jasoren. Note the green arrow projected on the floor, along with the product photo and pick quantity guiding the order picker.

(3) Quality assurance & accessible expertise:

AR technology will also play a critical role in quality assurance, as it already does in Porsche’s assembly plant in Leipzig, Germany. Whenever manufacturers require guidance from engineers, remote assistance is effectively no longer remote, as equipment experts guide employees through their AR glasses and teach them on the job.

Transportation & Navigation

(1) Autonomous vehicles:

To start, Nvidia’s Drive platform for Level 2+ autonomous vehicles is already combining sensor fusion and perception with AR dashboard displays to alert drivers of road hazards, highlight points of interest, and provide navigation assistance.

Source: Next Reality – Augmented Reality News.

And in our current transition phase of partially autonomous vehicles, such AR integration allows drivers to monitor conditions yet eases the burden of constant attention to the road. Along these lines, Volkswagen has already partnered with Nvidia to produce I.D. Buzz electric cars, set to run on the Drive OS by 2020. And Nvidia’s platform is fast on the move, having additionally partnered with Toyota, Uber, and Mercedes-Benz. Within just the next few years, AR displays may be commonplace in these vehicles.

(2) Navigation:

Source: The Verge.

We’ve all seen (or been) that someone spinning around with their smartphone to decipher the first few steps of a digital map’s commands. But AR is already making everyday navigation intuitive and efficient.

Google Maps’ AR feature has already been demoed on Pixel phones: instead of staring at your map from a bird’s eye view, users direct their camera at the street, and superimposed directions are immediately layered virtually on top.

Not only that, but as AI identifies what you see, it instantaneously communicates with your GPS to pinpoint your location and orientation. Although a mainstream rollout date has not yet been announced, this feature will likely make it to your phone in the very near future.
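
For the technically curious, here is a minimal, hypothetical Python sketch of the kind of fusion such a feature implies: a coarse GPS fix blended with a more precise camera-based position estimate, from which the heading of the overlaid arrow can be computed. This is purely illustrative and not Google’s actual pipeline.

```python
import math

# Hypothetical sketch: blend a coarse GPS fix with a more precise camera-based
# position estimate (inverse-variance weighting), then compute the heading of
# the arrow to overlay. Coordinates are meters east (x) and north (y).

def fuse_position(gps_xy, gps_sigma_m, visual_xy, visual_sigma_m):
    """Weighted blend of two 2D position estimates, in meters."""
    w_gps, w_vis = 1.0 / gps_sigma_m ** 2, 1.0 / visual_sigma_m ** 2
    x = (w_gps * gps_xy[0] + w_vis * visual_xy[0]) / (w_gps + w_vis)
    y = (w_gps * gps_xy[1] + w_vis * visual_xy[1]) / (w_gps + w_vis)
    return x, y

def heading_to(position_xy, waypoint_xy):
    """Compass heading (degrees clockwise from north) toward the next waypoint."""
    dx, dy = waypoint_xy[0] - position_xy[0], waypoint_xy[1] - position_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

# GPS alone is ~5 m accurate; a visual fix against recognized storefronts ~0.5 m.
pos = fuse_position(gps_xy=(12.0, 40.0), gps_sigma_m=5.0,
                    visual_xy=(10.4, 41.2), visual_sigma_m=0.5)
print(pos, heading_to(pos, waypoint_xy=(25.0, 60.0)))
```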

Entertainment

(1) Gaming:

We got our first taste of AR’s real-world gamification in 2016, when Niantic (in partnership with Nintendo) released Pokémon Go. Today, the app has surpassed 1 billion downloads. But in contrast to VR, AR is increasingly seen as a medium for bringing gamers together in the physical world, encouraging outdoor exploration, activity, and human connection in the process.

And in the rapidly growing eSports industry, AR has the potential to turn players’ screens into live-action stadiums. Just this year, the global eSports market is projected to exceed US$1.1 billion in revenue, and AR’s potential to elevate the experience will only see this number soar.

(2) Art:

Many of today’s most popular AR apps allow users to throw dinosaurs into their surroundings (Monster Park), learn how to dance (Dance Reality), or try on highly convincing virtual tattoos (InkHunter).

And as high-definition rendering becomes more commonplace, art, too, will grow more and more accessible.

Magic Leap aims to construct an entire “Magicverse” of digital layers superimposed on our physical reality. Location-based AR displays, ranging from art installations to gaming hubs, will be viewable in a shared experience across hundreds of headsets. Individuals will simply toggle between modes to access whichever version of the universe they desire. Endless opportunities to design our surroundings will arise.  

Apple, in its own right, recently announced the company’s [AR]T initiative, which consists of floating digital installations. Viewable through [AR]T Viewer apps in Apple stores, these installations can also be found in [AR]T City Walks guiding users through popular cities, and [AR]T Labs, which teach participants how to use Swift Playgrounds (an iPad app) to create AR experiences.

(3) Shows:

And at the recent Siggraph Conference in Los Angeles, Magic Leap introduced an AR-theater hybrid called Mary and the Monster, wherein viewers watched a barren “diorama-like stage” come to life in AR.

Source: Venture Beat.

While audience members shared a common experience, as in a traditional play, individuals could also zoom in on specific actors to observe their expressions more closely.

Say goodbye to opera glasses and hello to AR headsets.

Final Thoughts

While AR headset manufacturers and mixed reality developers race to build enterprise solutions from manufacturing to transportation, AR’s use in consumer products is following close behind.

Magic Leap leads the way in developing consumer experiences we’ve long been waiting for, as the “Magicverse” of localized AR displays in shared physical spaces will reinvent our modes of connection.

And as AR-supportive hardware is now built into today’s newest smartphones, businesses have an invaluable opportunity to gamify products and immerse millions of consumers in service-related AR experiences.

Even beyond the most obvious first-order AR business cases, new industries to support the augmented world of 2030 will soon surge in market competition, whether headset hardware, data storage solutions, sensors, or holographic and projection technologies.

Jump on the bandwagon now— the future is faster than you think!

Board of Directors | Board of Advisors | Strategic Leadership

Please keep me in mind as your Executive Coach, and for Senior Executive Engagements and Board of Director openings. If you hear of anything within your network that you think might be a positive fit, I’d so appreciate it if you could send a heads up my way. Email me: [email protected] or Schedule a call: Cliff Locks

Download Resume

Download Resume (PDF)

#BoardofDirectors #BoD #artificialintelligence #AI #innovation #IoT #virtualreality #vr #AR #augmentedreality #HR #executive #business #CXO #CEO #CFO #CIO #BoardofDirectors #executive #success #work #follow #leadership #Engineering #corporate #office #Biotech #Cleantech #CAD #entrepreneur #coaching #businessman #professional #excellence #development #motivation Contributors: Peter Diamandis and Clifford Locks #InvestmentCapitalGrowth

How AR, AI, Sensors & Blockchain are Merging Into Web 3.0

Posted by Cliff Locks On September 18, 2019 at 10:31 am / In: Uncategorized

How AR, AI, Sensors & Blockchain are Merging Into Web 3.0

How each of us sees the world is about to change dramatically…

For all of human history, the experience of looking at the world was roughly the same for everyone. But boundaries between the digital and physical are beginning to fade.

The world around us is gaining layer upon layer of digitized, virtually overlaid information — making it rich, meaningful, and interactive. As a result, our respective experiences of the same environment are becoming vastly different, personalized to our goals, dreams, and desires.

Welcome to Web 3.0, aka The Spatial Web. In version 1.0, static documents and read-only interactions limited the internet to one-way exchanges. Web 2.0 provided quite an upgrade, introducing multimedia content, interactive web pages, and participatory social media. Yet, all this was still mediated by 2D screens.

And today, we are witnessing the rise of Web 3.0, riding the convergence of high-bandwidth 5G connectivity, rapidly evolving AR eyewear, an emerging trillion-sensor economy, and ultra-powerful AIs.

As a result, we will soon be able to superimpose digital information atop any physical surrounding—freeing our eyes from the tyranny of the screen, immersing us in smart environments, and making our world endlessly dynamic.

In this third blog of our five-part series on augmented reality, we will explore the convergence between AR, AI, sensors, and blockchain, diving into the implications through a key use case in manufacturing.

A Tale of Convergence

Let’s deconstruct everything beneath the sleek AR display.

It all begins with Graphics Processing Units (GPUs) — electronic circuits that perform rapid calculations to render images. (GPUs can be found in mobile phones, game consoles, and computers.)

However, because AR requires such extensive computing power, single GPUs will not suffice. Instead, blockchain can now enable distributed GPU processing power, and blockchains specifically dedicated to AR holographic processing are on the rise.

Next up, cameras and sensors will aggregate real-time data from any environment to seamlessly integrate physical and virtual worlds. Meanwhile, body-tracking sensors are critical for aligning a user’s self-rendering in AR with a virtually enhanced environment. Depth sensors then provide data for 3D spatial maps, while cameras absorb more surface-level, detailed visual input. In some cases, sensors might even collect biometric data, such as heart rate and brain activity, to incorporate health-related feedback in our everyday AR interfaces and personal recommendation engines.

The next step in the pipeline involves none other than AI. Processing enormous volumes of data instantaneously, embedded AI algorithms will power customized AR experiences in everything from artistic virtual overlays to personalized dietary annotations.

In retail, AIs will use your purchasing history, current closet inventory, and possibly even mood indicators to display digitally rendered items most suitable for your wardrobe, tailored to your measurements.

In healthcare, smart AR glasses will provide physicians with immediately accessible and maximally relevant information (parsed from the entirety of a patient’s medical records and current research) to aid in accurate diagnoses and treatments, freeing doctors to engage in the more human-centric tasks of establishing trust, educating patients and demonstrating empathy.

Convergence in Manufacturing

One of the nearest-term use cases of AR is manufacturing, as large producers begin dedicating capital to enterprise AR headsets. And over the next ten years, AR will converge with AI, sensors, and blockchain to multiply manufacturer productivity and improve the employee experience.

(1) Convergence with AI

In initial application, digital guides superimposed on production tables will vastly improve employee accuracy and speed, while minimizing error rates.

Already, the International Air Transport Association (IATA) — whose airlines supply 82 percent of air travel — recently implemented industrial tech company Atheer’s AR headsets in cargo management. And with barely any delay, IATA reported a whopping 30 percent improvement in cargo handling speed and no less than a 90 percent reduction in errors.

With similar success rates, Boeing brought Skylight’s smart AR glasses to the runway, now used in the manufacturing of hundreds of airplanes. Sure enough—the aerospace giant has now seen a 25 percent drop in production time and near-zero error rates.

Beyond cargo management and air travel, however, smart AR headsets will also enable on-the-job training without reducing the productivity of other workers or sacrificing hardware. Jaguar Land Rover, for instance, implemented Bosch’s Re’flekt One AR solution to gear technicians with “x-ray” vision: allowing them to visualize the insides of Range Rover Sport vehicles without removing any dashboards.

And as enterprise capabilities continue to soar, AIs will soon become the go-to experts, offering support to manufacturers in need of assembly assistance. Instant guidance and real-time feedback will dramatically reduce production downtime, boost overall output, and even help customers struggling with DIY assembly at home.

Perhaps one of the most profitable business opportunities, AR guidance through centralized AI systems will also serve to mitigate supply chain inefficiencies at extraordinary scale. Coordinating moving parts, eliminating the need for manned scanners at each checkpoint, and directing traffic within warehouses, joint AI-AR systems will vastly improve workflow while overseeing quality assurance.

After its initial implementation of AR “vision picking” in 2015, leading courier company DHL recently announced it would continue to use Google’s newest smart lens in warehouses across the world. Motivated by the initial group’s reported 15 percent jump in productivity, DHL’s decision is part of the logistics giant’s $300 million investment in new technologies.

And as direct-to-consumer e-commerce fundamentally transforms the retail sector, supply chain optimization will only grow increasingly vital. AR could very well prove the definitive step for gaining a competitive edge in delivery speeds.

As explained by Vital Enterprises CEO Ash Eldritch, “All these technologies that are coming together around artificial intelligence are going to augment the capabilities of the worker and that’s very powerful. I call it Augmented Intelligence. The idea is that you can take someone of a certain skill level and by augmenting them with artificial intelligence via augmented reality and the Internet of Things, you can elevate the skill level of that worker.”

Already, large producers like Goodyear, thyssenkrupp, and Johnson Controls are using the Microsoft HoloLens 2—priced at $3,500 per headset—for manufacturing and design purposes.

Perhaps the most heartening outcome of the AI-AR convergence is that, rather than replacing humans in manufacturing, AR is an ideal interface for human collaboration with AI. And as AI merges with human capital, prepare to see exponential improvements in productivity, professional training, and product quality.

(2) Convergence with Sensors

On the hardware front, these AI-AR systems will require a mass proliferation of sensors to detect the external environment and apply computer vision in AI decision-making.

To measure depth, for instance, some scanning depth sensors project a structured pattern of infrared light dots onto a scene, detecting and analyzing the reflected light to generate 3D maps of the environment. Stereoscopic imaging, using two lenses, has also been commonly used for depth measurement. But leading devices like Microsoft’s HoloLens 2 and Intel’s RealSense 400-series camera implement a newer method called “phased time-of-flight” (ToF).

In ToF sensing, the HoloLens 2 uses numerous lasers, each with 100 milliwatts (mW) of power, in quick bursts. The distance between nearby objects and the headset wearer is then measured by how far the phase of the returning light has shifted relative to the emitted signal. That phase difference reveals the distance to each object within the field of view, enabling accurate hand-tracking and surface reconstruction.
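
To see how little arithmetic the core phase measurement requires, here is an illustrative Python sketch of the phase-to-distance conversion. The 50 MHz modulation frequency is an assumed example value, not a published HoloLens 2 specification.

```python
import math

# Illustrative phase-ToF arithmetic only. The 50 MHz modulation frequency below
# is an assumed example value, not a published HoloLens 2 specification.
C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(phase_shift_rad, mod_freq_hz):
    """Distance implied by the phase shift of the returning, modulated light.

    The round trip delays the return signal by phase_shift_rad, so
    distance = c * phase / (4 * pi * f).
    """
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

def ambiguity_range_m(mod_freq_hz):
    """Beyond this distance the phase wraps around and readings become ambiguous."""
    return C / (2.0 * mod_freq_hz)

f_mod = 50e6                                   # assumed 50 MHz modulation
print(tof_distance_m(math.pi / 2, f_mod))      # ~0.75 m for a quarter-cycle shift
print(ambiguity_range_m(f_mod))                # ~3 m unambiguous range at 50 MHz
```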

With a far lower computing power requirement, the phased ToF sensor is also more durable than stereoscopic sensing, which relies on the precise alignment of two prisms. The phased ToF sensor’s silicon base also makes it easily mass-produced, rendering the HoloLens 2 a far better candidate for widespread consumer adoption.

To apply inertial measurement—typically used in airplanes and spacecraft—the HoloLens 2 additionally uses a built-in accelerometer, gyroscope, and magnetometer. Further equipped with four “environment understanding cameras” that track head movements, the headset also uses a 2.4MP HD photographic video camera and ambient light sensor that work in concert to enable advanced computer vision.

For natural viewing experiences, sensor-supplied gaze tracking increasingly creates depth in digital displays. Nvidia’s work on Foveated AR Display, for instance, brings the primary foveal area into focus, while peripheral regions fall into a softer background— mimicking natural visual perception and concentrating computing power on the area that needs it most.

Gaze tracking sensors are also slated to grant users control over their (now immersive) screens without any hand gestures. Simple visual cues, such as staring at an object for more than three seconds, will activate commands instantaneously.
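
A toy sketch of what such a dwell-based selection loop might look like is below; the class and the three-second threshold are illustrative only, not a vendor API.

```python
import time

# Toy sketch of a dwell-based gaze selector; the class and 3-second threshold
# are illustrative, not any vendor's API.

class DwellSelector:
    """Fires a command once gaze has rested on the same target long enough."""

    def __init__(self, dwell_seconds=3.0):
        self.dwell = dwell_seconds
        self.target = None
        self.since = None

    def update(self, gazed_target, now=None):
        """Call every frame with the object currently under the user's gaze.

        Returns the target once per completed dwell, otherwise None.
        """
        now = time.monotonic() if now is None else now
        if gazed_target != self.target:
            self.target, self.since = gazed_target, now
            return None
        if self.target is not None and now - self.since >= self.dwell:
            self.since = float("inf")   # do not re-fire until gaze moves away
            return self.target
        return None

# Simulated frames: staring at a lamp for just over three seconds activates it.
selector = DwellSelector()
for t, obj in [(0.0, "lamp"), (1.5, "lamp"), (3.1, "lamp"), (3.2, "lamp")]:
    if selector.update(obj, now=t):
        print("activate:", obj)
```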

And our manufacturing example above is not the only one. Stacked convergence of blockchain, sensors, AI and AR will disrupt almost every major industry.

Take healthcare, for example, wherein biometric sensors will soon customize users’ AR experiences. Already, MIT Media Lab’s Deep Reality group has created an underwater VR relaxation experience that responds to real-time brain activity detected by a modified version of the Muse EEG headband. The experience even adapts to users’ biometric data, from heart rate to electrodermal activity (measured by an Empatica E4 wristband).

Now rapidly dematerializing, sensors will converge with AR to improve physical-digital surface integration, intuitive hand and eye controls, and an increasingly personalized augmented world. Keep an eye on companies like MicroVision, now making tremendous leaps in sensor technology.

While I’ll be doing a deep dive into sensor applications across each industry in our next blog, it’s critical to first discuss how we might power sensor- and AI-driven augmented worlds.

(3) Convergence with Blockchain

Because AR requires much more compute power than typical 2D experiences, centralized GPUs and cloud computing systems are hard at work to provide the necessary infrastructure. Nonetheless, the workload is taxing and blockchain may prove the best solution.

A major player in this pursuit, Otoy aims to create the largest distributed GPU network in the world, called the Render Network (RNDR). Built specifically on the Ethereum blockchain for holographic media, and currently in beta testing, this network is set to make AR deployment far more accessible.

Alphabet Chairman Eric Schmidt (an investor in Otoy’s network), has even said, “I predicted that 90% of computing would eventually reside in the web based cloud… Otoy has created a remarkable technology which moves that last 10%—high-end graphics processing—entirely to the cloud. This is a disruptive and important achievement. In my view, it marks the tipping point where the web replaces the PC as the dominant computing platform of the future.”

Leveraging the crowd, RNDR allows anyone with a GPU to contribute their power to the network for a commission of up to $300 a month in RNDR tokens. These can then be redeemed in cash or used to create users’ own AR content.

In a double win, Otoy’s blockchain network and similar iterations not only allow designers to profit when not using their GPUs, but also democratize the experience for newer artists in the field.

And beyond these networks’ power suppliers, distributing GPU processing power will allow more manufacturing companies to access AR design tools and customize learning experiences. By further dispersing content creation across a broad network of individuals, blockchain also has the valuable potential to boost AR hardware investment across a number of industry beneficiaries.

On the consumer side, startups like Scanetchain are also entering the blockchain-AR space for a different reason. Allowing users to scan items with their smartphone, Scanetchain’s app provides access to a trove of information, from manufacturer and price, to origin and shipping details.

Based on NEM (a peer-to-peer cryptocurrency that implements a blockchain consensus algorithm), the app aims to make information far more accessible and, in the process, create a social network of purchasing behavior. Users earn tokens by watching ads, and all transactions are hashed into blocks and securely recorded.
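
As a generic illustration of what “hashed into blocks and securely recorded” means in practice, here is a minimal Python sketch that chains scan records together with SHA-256 digests. It is not NEM’s consensus algorithm or Scanetchain’s actual data structures.

```python
import hashlib
import json
import time

# Generic illustration of hashing records into chained blocks; this is not
# NEM's consensus algorithm or Scanetchain's actual data structures.

def hash_block(block: dict) -> str:
    """SHA-256 digest over a canonical JSON encoding of the block contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(transactions, prev_hash):
    """Bundle scan/purchase records into a block chained to its predecessor."""
    block = {"timestamp": time.time(), "transactions": transactions,
             "prev_hash": prev_hash}
    block["hash"] = hash_block(block)
    return block

genesis = make_block([], prev_hash="0" * 64)
scan = {"user": "alice", "item": "sneaker-123", "action": "scan", "reward_tokens": 2}
block1 = make_block([scan], prev_hash=genesis["hash"])
print(block1["hash"])  # altering any recorded scan would change this digest
```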

The writing is on the wall—our future of brick-and-mortar retail will largely lean on blockchain to create the necessary digital links.

Final Thoughts

Integrating AI into AR creates an “auto-magical” manufacturing pipeline that will fundamentally transform the industry, cutting down on marginal costs, reducing inefficiencies and waste, and maximizing employee productivity.

Bolstering the AI-AR convergence, sensor technology is already blurring the boundaries between our augmented and physical worlds, soon to be near-undetectable. While intuitive hand and eye motions dictate commands in a hands-free interface, biometric data is poised to customize each AR experience to be far more in touch with our mental and physical health.

And underpinning it all, distributed computing power with blockchain networks like RNDR will democratize AR, boosting global consumer adoption at plummeting price points.

As AR soars in importance—whether in retail, manufacturing, entertainment, or beyond—the stacked convergence discussed above merits significant investment over the next decade. Already, 52 Fortune 500 companies have begun testing and deploying AR/VR technology. And while global revenue from AR/VR stood at $5.2 billion in 2016, market intelligence firm IDC predicts the market will exceed $162 billion in value by 2020.

The augmented world is only just getting started.

Board of Directors | Board of Advisors | Strategic Leadership

Please keep me in mind as your Executive Coach, and for Senior Executive Engagements and Board of Director openings. If you hear of anything within your network that you think might be a positive fit, I’d so appreciate it if you could send a heads up my way. Email me: [email protected] or Schedule a call: Cliff Locks

Download Resume

Download Resume (PDF)

#BoardofDirectors #BoD #artificialintelligence #AI #innovation #IoT #virtualreality #vr #AR #augmentedreality #HR #executive #business #CXO #CEO #CFO #CIO #BoardofDirectors #executive #success #work #follow #leadership #Engineering #corporate #office #Biotech #Cleantech #CAD #entrepreneur #coaching #businessman #professional #excellence #development #motivation Contributors: Peter Diamandis and Clifford Locks #InvestmentCapitalGrowth

Smart Technology and Integration, How It’s Changing Our Lives

Posted by Cliff Locks On August 28, 2019 at 10:15 am / In: Uncategorized

Smart Technology and Integration, How It’s Changing Our Lives

Each week alone, an estimated 1.3 million people move into cities, driving urbanization on an unstoppable scale. 

By 2040, about two-thirds of the world’s population will be concentrated in urban centers. Over the decades ahead, 90 percent of this urban population growth is predicted to flourish across Asia and Africa.

Already, 1,000 smart city pilots are under construction or in their final urban planning stages across the globe, driving forward countless visions of the future.

As data becomes the gold of the 21st century, centralized databases and hyper-connected infrastructures will enable everything from sentient cities that respond to data inputs in real time, to smart public services that revolutionize modern governance. 

Connecting countless industries — real estate, energy, sensors and networks, transportation, among others — tomorrow’s cities pose no end of creative possibilities and stand to completely transform the human experience.

In this blog, we’ll be taking a high-level tour of today’s cutting-edge urban enterprises involved in these three areas:

  1. Hyperconnected urban ecosystems that respond to your data
  2. Smart infrastructure and construction
  3. Self-charging green cities

Let’s dive in!

Smart Cities that Interact with Your Data

Any discussion of smart cities must also involve today’s most indispensable asset: data.

As 5G connection speeds, IoT-linked devices and sophisticated city AIs give birth to trillion-sensor economies, low latencies will soon allow vehicles to talk to each other and infrastructure systems to self-correct.

Even public transit may soon validate your identity with a mere glance in any direction, using facial recognition to charge you for individualized travel packages and distances.

As explained by Deloitte Public Sector Leader Clare Ma, “real-time information serves as the ‘eye’ for urban administration.”

In most cities today, data is fragmented across corporations, SMEs, public institutions, nonprofits, and personal databases, with little standardization.

Yet to identify and respond to urban trends, we need a way of aggregating multiple layers of data, spanning traffic flows, human movement, individual transactions, shifts in energy usage, security activity, and almost any major component of contemporary economies.

Only through real-time analysis of information flows can we leverage exponential technologies to automate public services, streamline transit, strengthen security, optimize urban planning, and build responsive infrastructure.

And already, cutting-edge cities across the globe are building centralized data platforms to combine different standards and extract actionable insights, from smart parking to waste management. 

Take China’s Nanjing, for instance. 

With sensors installed in 10,000 taxis, 7,000 buses and over 1 million private vehicles, the city aggregates daily data across both physical and virtual networks. After transmitting it to the Nanjing Information Center, experts can then analyze traffic data, send smartphone updates to commuters and ultimately create new traffic routes.

Replacing the need for capital-intensive road and public transit reconstruction, real-time data from physical transit networks allows governments to maximize the value of preexisting assets, saving time and increasing productivity for millions of citizens.

But beyond traffic routing, proliferating sensors and urban IoT are giving rise to real-time monitoring of any infrastructural system.

Italy’s major rail operator Trenitalia has now installed sensors on all its trains, deriving real-time status updates on each train’s mechanical condition. Now that maintenance needs can be predicted in advance of system failure, transit disruptions are becoming a thing of the past.

Los Angeles has embedded sensors in 4,500 miles worth of new LEDs (replacing previous streetlights). The minute one street bulb malfunctions or runs low, it can be fixed near-immediately, forming part of a proactive city model that detects glitches before they occur.

And Hangzhou, home to e-commerce giant Alibaba, has now launched a “City Brain” project, aiming to build out one of the most data-responsive cities on the planet.

With cameras and other sensors installed across the entire city, a centralized AI hub processes data on everything from road conditions to weather data to vehicular collisions and citizen health emergencies.

Overseeing a population of nearly 8 million residents, Hangzhou’s City Brain then manages traffic signals at 128 intersections (coordinating over 1,000 road signals simultaneously), tracks ambulances en route and clears their paths to hospitals without risk of collision, directs traffic police to accidents at record rates, and even assists city officials in expedited decision-making. No more wasting time at a red light when there is obviously no cross traffic or pedestrians.
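
As a simplified illustration of that last point, here is a toy demand-responsive signal controller in Python: serve the approach with the longest detected queue, and always yield to an approaching ambulance. It is a sketch of the concept, not Alibaba’s City Brain.

```python
# Toy demand-responsive signal controller illustrating the idea above; it is
# not Alibaba's City Brain. Queue counts would come from cameras and sensors.

def choose_green(queues, current_green, min_green_s, elapsed_s, emergency=None):
    """Pick which approach gets the green for the next control step.

    queues:    dict of approach -> vehicles currently detected waiting
    emergency: approach an ambulance is arriving from, if any (gets priority)
    """
    if emergency is not None:
        return emergency
    if elapsed_s < min_green_s:            # honor a minimum green to avoid thrashing
        return current_green
    return max(queues, key=queues.get)     # otherwise serve the longest queue

queues = {"north": 0, "south": 0, "east": 7, "west": 2}
print(choose_green(queues, current_green="north", min_green_s=10, elapsed_s=25))
# -> "east": nobody is waiting on the north-south axis, so it is not held green.
```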

Already, the City Brain has cut ambulance and commuter traveling times by half. And as reported by China’s first AI-partnered traffic policeman Zheng Yijiong, “the City Brain can detect accidents within a second” allowing police to “arrive at [any] site [within] 5 minutes” across an urban area of over 3,000 square miles.

But beyond oversight of roads, traffic flows, collisions and the like, converging sensors and AI are now being used to monitor crowds and analyze human movement.

Companies like SenseTime now offer software to police bureaus that can not only identify live faces, individual gaits and car license plates, but even monitor crowd movement and detect unsafe pedestrian concentrations.

Some researchers have even posited the use of machine learning to predict population-level disease spread through crowd surveillance data, building actionable analyses from social media data, mass geolocation and urban sensors.

Yet aside from self-monitoring cities and urban AI ‘brains,’ what if infrastructure could heal itself on demand? Forget sensors, connectivity and AI — enter materials science.

Self-Healing Infrastructure 

The U.S. Department of Transportation estimates a $542.6 billion backlog of needed U.S. infrastructure repairs.

And as I’ve often said, the world’s most expensive problems are the world’s most profitable opportunities.

Enter self-healing construction materials.

First up, concrete.

In an effort to multiply the longevity of bridges, roads, and any number of infrastructural fortifications, engineers at Delft University have developed a prototype of bio-concrete that can repair its own cracks.

Mixed in with calcium lactate, the key ingredients of this novel ‘bio-concrete’ are minute capsules of limestone-producing bacteria distributed throughout any concrete structure. Only when the concrete cracks, letting in air and moisture, do the bacteria awaken.

Like clockwork, the bacteria begin feeding on the surrounding calcium lactate, producing a natural limestone sealant that can fill cracks in a mere three weeks — long before small crevices can even threaten structural integrity.

As head researcher Henk Jonkers explains, “What makes this limestone-producing bacteria so special is that they are able to survive in concrete for more than 200 years and come into play when the concrete is damaged. […] If cracks appear as a result of  pressure on the concrete, the concrete will heal these cracks itself.”

Yet other researchers have sought to crack the code (no pun intended) of living concrete, testing everything from hydrogels that expand 10X or even 100X their original size when in contact with moisture, to fungal spores that grow and precipitate calcium carbonate the minute micro-cracks appear.

But bio-concrete is only the beginning of self-healing technologies. 

As futurist architecture firms start printing plastic and carbon-fiber houses, engineers are tackling self-healing plastic that could change the game with economies of scale. 

Plastic not only holds promise in real estate on Earth; it will also serve as a handy material in space. NASA engineers have pioneered a self-healing plastic that may prove vital in space missions, preventing habitat and ship ruptures in record speed. 

The implications of self-healing materials are staggering, offering us resilient structures both on earth and in space.

One additional breakthrough worth noting involves the magic of graphene.

Perhaps among the greatest physics discoveries of the century, graphene is a 2D honeycomb lattice of carbon atoms over 200X stronger than steel, yet only one atom thick.

While yet to come down in cost, graphene unlocks an unprecedented host of possibilities, from weather-resistant and ultra-strong coatings for existing infrastructure, to multiplied infrastructural lifespans. Some have even posited graphene’s use in the construction of 30 km tall buildings.

And it doesn’t end there.

As biomaterials and novel polymers allow future infrastructure to heal on its own, nano- and micro-materials are ushering in a new era of smart, super-strong and self-charging buildings.

Revolutionizing structural flexibility, carbon nanotubes are already dramatically increasing the strength-to-weight ratio of skyscrapers. 

But imagine if we could engineer buildings that could charge themselves… or better yet, produce energy for entire cities, seamlessly feeding energy to the grid.

Self-Powering Cities

As exponential technologies across energy and water burst onto the scene, self-charging cities are becoming today’s testing ground for a slew of green infrastructure pilots, promising a future of self-sufficient societies.

In line with new materials, one hot pursuit surrounds the creation of commercializable solar power-generating windows. 

In the past few years, several research teams have pioneered silicon nanoparticles to capture everyday light flowing through our windows. Little solar cells at the edges of windows then harvest this energy for ready use. 

Scientists at Michigan State, for instance, have developed novel “solar concentrators.” Capable of being layered over any window, these solar concentrators leverage non-visible wavelengths of light — near infrared and ultraviolet — pushing them to those solar cells embedded at the edge of each window panel.

Rendered entirely invisible, such solar cells could generate energy on almost any sun-facing screen, from electronic gadgets to glass patio doors to reflective skyscrapers.
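
For a sense of scale, here is a back-of-envelope Python estimate of what a single such window might generate; the 5 percent efficiency figure is an assumed, optimistic value for transparent concentrators, not a product specification.

```python
# Back-of-envelope arithmetic only. The 5% efficiency figure is an assumed,
# optimistic value for transparent solar concentrators, not a product spec.

def window_power_watts(area_m2, irradiance_w_m2=1000.0, efficiency=0.05):
    """Rough electrical output of a solar-concentrator window in full sun."""
    return area_m2 * irradiance_w_m2 * efficiency

# A 2 m x 1.5 m sun-facing office window:
print(window_power_watts(area_m2=3.0))  # ~150 W under these assumptions
```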

And beyond self-charging windows, countless future city pilots have staked ambitious goals for solar panel farms and renewable energy targets.

Take Dubai’s “Strategic Plan 2021,” for instance.

Touting a multi-decade Dubai Clean Energy Strategy, Dubai aims to gradually derive 75 percent of its energy from clean sources by 2050.

With plans to launch the largest single-site solar project on the planet by 2030, boasting a projected capacity of 5,000 megawatts, Dubai further aims to derive 25 percent of its energy needs from solar power in the next decade.

And in the city’s “Strategic Plan 2021,” Dubai aims to soon:

  • 3D-print 25 percent of its buildings;
  • Make 25 percent of transit automated and driverless;
  • Install hundreds of artificial “trees,” all leveraging solar power and providing the city with free WiFi, info-mapping screens, and charging ports;
  • Integrate passenger drones capable of carrying individuals to public transit systems;
  • And drive forward countless designs of everything from underwater bio-desalination plants to smart meters and grids.

A global leader in green technologies and renewable energy, Dubai stands as a gleaming example that any environmental context can give rise to thriving and self-sufficient eco-powerhouses.

But Dubai is not alone, and others are quickly following suit.

Leading the pack of China’s 500 smart city pilots, Xiong’an New Area (near Beijing) aims to become a thriving economic zone powered by 100 percent clean electricity.

And as of this past December, 100 U.S. cities have committed to, and are on their way toward, the same goal.

Cities as Living Organisms

As new materials forge ahead to create pliable and self-healing structures, green infrastructure technologies are exploding into a competitive marketplace.

Aided by plummeting costs, future cities will soon surround us with self-charging buildings, green city ecosystems, and urban residences that generate far more than they consume.

And as 5G communications networks, proliferating sensors and centralized AI hubs monitor and analyze every aspect of our urban environments, cities are fast becoming intelligent organisms, capable of seeing and responding to our data in real time.

Board of Directors | Board of Advisors | Strategic Leadership

Please keep me in mind as your Executive Coach, and for Senior Executive Engagements and Board of Director openings. If you hear of anything within your network that you think might be a positive fit, I’d so appreciate it if you could send a heads up my way. Email me: [email protected] or Schedule a call: Cliff Locks

#BoardofDirectors #BoD #artificialintelligence #AI #innovation #HR #executive #business #CXO #CEO #CFO #CIO #executive #success #work #follow #leadership #corporate #office #Biotech #Cleantech #entrepreneur #coaching #businessman #professional #excellence #development #motivation Contributors: Peter Diamandis and Clifford Locks #InvestmentCapitalGrowth

Start to Ask WHO, not HOW for Successful Project Implementation

Posted by Cliff Locks On July 3, 2019 at 10:06 am / In: Uncategorized

Start to Ask WHO, not HOW for Successful Project Implementation

When most entrepreneurs (including me) face a challenge, our first reaction is to ask: “How do I solve this problem?”

As an Executive Strategic Coach, I teach a powerful management shortcut for success.

Don’t ask “how.” Instead, ask “who.”

This blog explores that concept. Feel free to contact me when you need a “who” to seamlessly execute a project.

Start to Ask WHO, not HOW…

How much value are you leaving on the table because you don’t have a WHO or because you are caught in the minutia of implementing a project?

As entrepreneurs, each of us has a constant stream of ideas and new projects that might add massive value — if they ever get implemented.

Now, as soon as I come up with an idea, my sole responsibility is to ask, “Who am I going to tag in to implement this project?” It has been an absolute game-changer.

Ultimately, asking WHO, not HOW, has transformed my ability to multiplex across my constantly increasing number of business ventures and projects.

Now if an idea comes to me during a moment of overload, I can still move it forward. I’ll spend 30 minutes creating an Impact Filter (a Strategic Coach client tool) explaining why the project is important, defining measurable criteria for success, and then hand that document to the right “who” in my ecosystem.

Simple enough, right? So why are we programmed to dive right into the HOW without thinking to ask WHO?

The Entrepreneur’s Dilemma…

As Dan Sullivan explained, “Our education system plays a major role in why we ask HOW and not WHO from the get-go. With the exception of a few exceptional schools, the education system is designed to prepare people for a life of ‘HOW.’

Kids in traditional classrooms around the world are graded on HOW they solve particular problems on their own. When you leave school, you need to collaborate and delegate to thrive. But in school, they don’t call it collaborating and delegating — they call it cheating.”

The education system engrains asking HOW and discourages asking WHO.

If you want to create a massive impact, you need to overcome old habits and begin to view human capital as an abundant resource. From there, curate a strong and passionate team to support you and act as your WHOs.

By delegating the HOW to my WHOs, my productivity and my overall passion go through the roof because I can remove myself from the mental weight and obligation of unfinished projects, allowing me to focus on what I truly love to do.

A final note for this section: you can even ask WHO when you build your team — go ahead and find yourself a WHO that finds WHOs!

Digitizing and Delocalizing WHOs

Over the past two decades, we’ve seen various forms of software emerge as the WHOs that figure out HOW.

I can verbally ask my phone (through Siri) ‘what is the GDP of Guatemala’, and Google serves as the WHO that executes the research task.

Before the advent of search engines, you’d have had to go to the library and do the research to find the right book, or you would have had to instruct an employee to travel and do that research for you.

Platforms and services like Amazon, Google and Baidu are all WHOs that entrepreneurs can tap to carry out the HOW.

In a similar vein, in a world soon to be electrified with gigabit connection speeds, entrepreneurs anywhere in the world can find their WHOs anywhere else in the world.

Eventually, our ultimate WHO will be an artificial intelligence software shell (think: Jarvis) that’s always on, always listening, always watching… always there to help and be the WHO for your every HOW.

Closing Thoughts

Finding your WHOs will make your HOWs happen faster and cheaper than ever before.

At the end of the day, while it’s really important for you as a leader to be smart, driven, ethical and visionary, the only way for you to scale your impact is to build an incredible team of WHOs behind you.

Right now is the greatest time in human history to find your WHOs. What are you waiting for?

Board of Directors | Board of Advisors | Strategic Leadership

Please keep me in mind as your Executive Coach, and for Senior Executive Engagements and Board of Director openings. If you hear of anything within your network that you think might be a positive fit, I’d so appreciate it if you could send a heads up my way. Email me: [email protected] or Schedule a call: Cliff Locks

#projectmanagement #artificialintelligence #AI #innovation #HR #executive #business #CXO #CEO #CFO #CIO #executive #success #work #follow #leadership #corporate #office #luxury #entrepreneur #coaching #businessman #professional #aviation #excellence #development Contributor: Peter Diamandis #motivation #InvestmentCapitalGrowth

Training and Retooling a Dynamic Workforce Using AR and VR

Posted by Cliff Locks On April 10, 2019 at 10:08 am / In: Uncategorized

Training and Retooling a Dynamic Workforce Using AR and VR

As I often tell my clients, we generally remember only 10 percent of what we see, 20 percent of what we hear, and 30 percent of what we read… but over a staggering 90 percent of what we do or experience.

By introducing gamification, immersive testing activities, and visually rich sensory environments, adult literacy platforms have a winning chance at scalability, retention and user persistence.

Beyond literacy, however, virtual and augmented reality have already begun disrupting the professional training market.

As projected by ABI Research, the enterprise VR training market is on track to exceed $6.3 billion in value by 2022.

Leading the charge, Walmart has already implemented VR across 200 Academy training centers, running over 45 modules and simulating everything from unusual customer requests to a Black Friday shopping rush.

Then in September of last year, Walmart committed to a 17,000-headset order of the Oculus Go to equip every U.S. Supercenter, neighborhood market, and discount store with VR-based employee training.

In the engineering world, Bell Helicopter is using VR to massively expedite development and testing of its latest aircraft, FCX-001. Partnering with Sector 5 Digital and HTC VIVE, Bell found it could concentrate a typical six-year aircraft design process into the course of six months, turning physical mockups into CAD-designed virtual replicas.

But beyond the design process itself, Bell is now one of a slew of companies pioneering VR pilot tests and simulations with real-world accuracy. Seated in a true-to-life virtual cockpit, pilots have now tested countless iterations of the FCX-001 in virtual flight, drawing directly onto the 3D model and enacting aircraft modifications in real-time.

And in an expansion of our virtual senses, several key players are already working on haptic feedback. In the case of VR flight, French company Go Touch VR is now partnering with software developer FlyInside on fingertip-mounted haptic tech for aviation.

Dramatically reducing time and trouble required for VR-testing pilots, they aim to give touch-based confirmation of every switch and dial activated on virtual flights, just as one would experience in a full-sized cockpit mockup. Replicating texture, stiffness and even the sensation of holding an object, these piloted devices contain a suite of actuators to simulate everything from a light touch to higher-pressured contact, all controlled by gaze and finger movements.

Learn Anything, Anytime, at Any Age

When it comes to other high-risk simulations, virtual and augmented reality have barely scratched the surface.

Firefighters can now combat virtual wildfires with new platforms like FLAIM Trainer or TargetSolutions. And thanks to the expansion of medical AR/VR services like 3D4Medical or Echopixel, surgeons might soon perform operations on annotated organs and magnified incision sites, speeding up reaction times and vastly improving precision.

But perhaps most urgently, Virtual Reality will offer an immediate solution to today’s constant industry turnover and large-scale re-education demands.

VR educational facilities with exact replicas of anything from large industrial equipment to minute circuitry will soon give anyone a second chance at the 21st-century job market.

Want to become an electric, autonomous vehicle mechanic at age 44? Throw on a demonetized VR module and learn by doing, testing your prototype iterations at almost zero cost and with no risk of harming others.

Want to be a plasma physicist and play around with a virtual nuclear fusion reactor? Now you’ll be able to simulate results and test out different tweaks, logging Smart Educational Record credits in the process.

As tomorrow’s career model shifts from a “one-and-done graduate degree” to lifelong learning, professional VR-based re-education will create a continuous education loop, lowering the barrier to entry for anyone wanting to try their hand at a new industry.

Whether in pursuit of fundamental life skills, professional training, linguistic competence or specialized retooling, users of all ages, career paths, income brackets and goals are now encouraged to be students, no longer condemned to stagnancy.

As VR and artificial intelligence converge with demonetized mobile connectivity, we are finally witnessing an era in which no one will be left behind.

#HR #leadership #business #CXO #CEO #CFO #Entrepreneur #WSJ #VC #socialmedia #Diversity #BigData #CorpGov #elearning #Marketing #Periscope #Recruiting #technology #startup #HRTech #Recruitment #sales #Healthcare #cloud #work

Please keep me in mind as your Executive Coach and for Senior Executive Engagements or Board of Director openings. If you hear of anything within your network that you think might be a positive fit, I’d appreciate a heads up. Email me: [email protected] or Schedule a call: Cliff Locks

Contributor: Peter Diamandis

Bringing artificial intelligence into your organization

Posted by Cliff Locks On April 3, 2019 at 10:04 am / In: Uncategorized

Bringing artificial intelligence into your organization

The goal of this post is to help you think about the specific benefits of artificial intelligence and the areas of your organization, or your area of responsibility, you might consider automating. Below are examples of successfully deployed artificial intelligence applications. When you need help, reach out to me; my contact information is at the bottom of this post.

AI tool helps companies detect expense account fraud.

Employers across a range of industries are using artificial intelligence in a bid to curb questionable write-offs hidden within employee expense reports, writes Angus Loten for WSJ Pro.

The cost of fraud. The Association of Certified Fraud Examiners, in a report last year, analyzed nearly 2,700 global employee-expense fraud cases detected over the previous year, which together resulted in $7 billion in losses.

AI-based fraud detection. AppZen offers an auditing tool that works with popular expense-management software packages such as SAP SE’s Concur or Chrome River Technologies Inc.’s Expense tool. AppZen can scour 100% of employee expense reports, according to the company. The tool’s capabilities include computer vision that can read submitted receipts, deep learning that leverages training data to account for nuances or identify anomalies, and semantic analysis to organize objects and relationships, such as currencies, taxes and spend types.

AI can speed and improve audits. Manual audits typically rely on only a random sampling of less than 10% of expense reports, allowing many erroneous or fraudulent claims to slip through undetected, says Anant Kale, AppZen’s chief executive. And while manual audits can take days or even weeks to complete, AppZen’s automated review takes only a few minutes to flag questionable items, the company says. These can range from minor violations, such as accidental double entries for the same expense reported by separate employees, out-of-policy hotel mini-bar purchases or unapproved upgrades to first-class airline seats, to cases where outright fraud may be occurring.
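
To make the flag-and-review pattern concrete, here is a minimal, hypothetical sketch of rule-based expense screening in Python. The category limits, field names, and duplicate check are illustrative assumptions for this post, not AppZen’s actual implementation, which layers receipt-reading computer vision and learned anomaly models on top of this kind of policy logic.

```python
from collections import defaultdict

# Illustrative policy limits -- assumptions for this sketch, not any vendor's real rules.
POLICY_LIMITS = {"hotel_minibar": 0.0, "meal": 75.0, "airfare_upgrade": 0.0}

def flag_expenses(reports):
    """Return (report_id, reason) tuples for questionable expense items.

    Each report is a dict like:
    {"id": "R1", "employee": "alice", "category": "meal",
     "amount": 120.0, "merchant": "Bistro", "date": "2019-03-02"}
    """
    flags = []
    seen = defaultdict(list)  # (merchant, date, amount) -> report ids already submitted

    for r in reports:
        # 1. Out-of-policy categories or amounts above the category limit.
        limit = POLICY_LIMITS.get(r["category"])
        if limit is not None and r["amount"] > limit:
            flags.append((r["id"], f"{r['category']} exceeds policy limit of ${limit:.0f}"))

        # 2. Possible duplicates: same merchant, date, and amount submitted
        #    more than once, even by different employees.
        key = (r["merchant"], r["date"], r["amount"])
        if seen[key]:
            flags.append((r["id"], f"possible duplicate of {seen[key][0]}"))
        seen[key].append(r["id"])

    return flags

if __name__ == "__main__":
    sample = [
        {"id": "R1", "employee": "alice", "category": "meal",
         "amount": 120.0, "merchant": "Bistro", "date": "2019-03-02"},
        {"id": "R2", "employee": "bob", "category": "meal",
         "amount": 120.0, "merchant": "Bistro", "date": "2019-03-02"},
        {"id": "R3", "employee": "carol", "category": "hotel_minibar",
         "amount": 14.5, "merchant": "Grand Hotel", "date": "2019-03-03"},
    ]
    for report_id, reason in flag_expenses(sample):
        print(report_id, "->", reason)
```

In a production system, hand-written rules like these would be supplemented by receipt OCR and models trained on historical claims, but the output is the same: a short list of flagged items for a human auditor to review instead of a random sample.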

Business Transformation

Foot Locker’s game plan to win over sneakerheads. Foot Locker Inc., spurred by growing market pressure to offer a higher degree of personalization and on-demand services, is aiming to integrate and gather data from across its operations—everything from website clicks to delivery preferences—and then apply algorithms to the data to quickly and accurately glean market intelligence, often in real time.

To do all of this, Pawan Verma, chief information and customer connectivity officer at the New York-based sports footwear retailer, has boosted the company’s tech staff roughly 30% over the past three years, while creating separate teams that work on data, apps, interfaces between apps and operating systems, artificial intelligence, augmented reality and machine learning. In an interview with WSJ Pro’s Angus Loten, Mr. Verma spoke about the challenges of turning a 45-year-old shoe retailer into an agile, tech-driven venture for Gen Z “sneaker freaks” and working with data and artificial intelligence.

WSJ: What are your biggest challenges working with data, AI and emerging digital capabilities?

Mr. Verma: There are several areas, but a key one is around security. We are collecting billions of events and using machine-learning software to find a signal from noise. For example, when we have a product launch, such as Nike Air Force or Jordan Retro, billions of bots mimicking customers will try to render our websites and mobile apps useless by staging distributed-denial-of-service attacks on our internal and cloud infrastructure. This can drive customers away from the products they want and impact the social currency of our brand. We created tools, with some vendor partnerships, that deflect bot traffic and protect the site.
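
As a rough illustration of the first line of defense Mr. Verma describes, the sketch below shows a simple request-velocity filter: clients that fire requests faster than a human plausibly could get rejected. The class name, thresholds, and logic are assumptions for illustration only, not Foot Locker’s or any vendor’s actual bot-mitigation stack, which combines many more signals such as device fingerprints, behavioral models and challenge pages.

```python
import time
from collections import defaultdict, deque

class VelocityFilter:
    """Flag clients whose request rate exceeds a human-plausible ceiling.

    A toy stand-in for the first layer of bot mitigation; real systems
    layer many additional signals on top of simple rate checks.
    """

    def __init__(self, max_requests=20, window_seconds=10.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.history = defaultdict(deque)  # client_id -> recent request timestamps

    def allow(self, client_id, now=None):
        now = time.monotonic() if now is None else now
        q = self.history[client_id]
        # Drop timestamps that have aged out of the sliding window.
        while q and now - q[0] > self.window:
            q.popleft()
        q.append(now)
        return len(q) <= self.max_requests

if __name__ == "__main__":
    f = VelocityFilter(max_requests=5, window_seconds=1.0)
    # A burst of 8 requests in the same instant: the last 3 are rejected.
    results = [f.allow("bot-123", now=0.0) for _ in range(8)]
    print(results)  # [True, True, True, True, True, False, False, False]
```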

Robots

Using robots to comfort the lonely. Sue Karp, who was forced to retire early by a stroke and now lives alone, begins every day by greeting her robot companion, ElliQ. The robot greets her back. “I’ve got dogs, but they don’t exactly come up and say ‘Good morning’ in English,” says Ms. Karp.

Robot pals. Intuition Robotics’ ElliQ can ease senior loneliness, reports the WSJ’s Christopher Mims. Studies have found that loneliness is worse for health than obesity or inactivity, and is as lethal as smoking 15 cigarettes a day. It’s also an epidemic: A recent study from Cigna Corp. found that about half of Americans are lonely.

What ElliQ can do. ElliQ consists of a tablet, a pair of cameras and a small robot head on a post, capable of basic gestures like leaning in to indicate interest and leaning back to signal disengagement. ElliQ can also help its owner connect to family members. Through an app, ElliQ will prompt children and grandchildren to start video chats with their relative, send notes and links, and share photos.

Human-like responses. Unlike Amazon.com Inc.’s Alexa or similar voice-activated assistants, ElliQ is capable of spontaneous communication, has a wide variety of responses and behaves unpredictably. Its creators say this is essential to making it feel, if not alive, then at least present. It uses what its creators call cognitive AI to know when to interrupt with a suggestion (“Take your medicine”) and when to stay quiet, such as when a person has a visitor.

Medicare Advantage might cover ElliQ. The robot is undergoing a trial with 100 participants conducted by researchers from Baycrest Health Sciences hospital in Toronto and the University of California San Francisco, at retirement communities in Palo Alto and Toronto, in part to verify that ElliQ alleviates feelings of loneliness. If so, the robot might be eligible for coverage under Medicare Advantage.

Human Capital

HR turns to artificial intelligence to speed recruiting. Human-resource departments are increasingly turning to AI technologies that can help reduce the time to fill open positions, reports the Financial Times. Among the new tools:

• Machine-learning systems that can sift through huge numbers of applications to find candidates who match an employer’s needs (see the sketch after this list).
• Chatbots that can answer candidate questions and help screen early-stage candidates.
• Video systems that can be used to interview candidates and help gauge whether a recruit comes across as confident or passionate.
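
As a minimal sketch of how the first tool in that list might score applicants, the example below ranks résumés by bag-of-words similarity to a job description. The function names, sample data and scoring choice are illustrative assumptions, not any vendor’s product.

```python
import math
import re
from collections import Counter

def _vectorize(text):
    """Lowercased bag-of-words term counts for a piece of text."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine_similarity(a, b):
    """Cosine similarity between two Counter term vectors."""
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def rank_candidates(job_description, resumes):
    """Rank (name, resume_text) pairs by textual similarity to the job description."""
    job_vec = _vectorize(job_description)
    scored = [(name, cosine_similarity(job_vec, _vectorize(text))) for name, text in resumes]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    job = "Retail data analyst with Python, SQL, and forecasting experience"
    resumes = [
        ("Candidate A", "Python and SQL analyst, built demand forecasting models for retail"),
        ("Candidate B", "Graphic designer experienced in branding and illustration"),
    ]
    for name, score in rank_candidates(job, resumes):
        print(f"{name}: {score:.2f}")
```

Real screening systems use far richer representations, such as structured skill taxonomies and learned embeddings, and, as the next paragraph notes, they still need to be audited for bias.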

While some HR tech firms claim their tools are free of bias, that hasn’t always proven to be the case. The systems also need to be trained to screen job candidates effectively. And then there’s the human tendency to overuse new tech tools, which could lead HR to bolt extra steps onto existing workflows and actually lengthen the hiring process.

Work in the age of AI. Employees and employers have a different perspective on how AI will change the workplace, according to a report in the MIT Sloan Management Review. Workers appear ready to embrace the changes that are coming. More than 60% of workers, according to an Accenture study, have a positive view of the impact of AI on their work. Business leaders, on the other hand, believe that only about one-quarter of their workforce is prepared for AI adoption.

Come together. But common ground can be found. It begins with senior executives seeking clarity around talent gaps and figuring out which skills their workers need. From there, execs should look at how to advance those skills for human-AI collaboration.

A different way to view the world. This calls for a new way of looking at business. First, employers and employees must show each other that they’re willing to adapt to a workplace built around people and intelligent machines. Second, worker education needs to embrace smart technologies to speed learning, expand thinking and bring out latent intelligence. And third, both parties must be motivated to learn and adapt.

#artificialintelligence #AI #innovation #HR #executive #business #CXO #CEO #success #work #follow #leadership #travel #corporate #office #luxury #entrepreneur #coaching #businessman #professional #aviation #excellence #development #motivation

Please keep me in mind as your Executive Coach and for Senior Executive Engagements or Board of Director openings. If you hear of anything within your network that you think might be a positive fit, I’d appreciate a heads up. Email me: [email protected] or Schedule a call: Cliff Locks

Contributor: Peter Diamandis