Short term: Trust vs Distrust – What is at stake for brands, corporations and executives? Guest Post by @Olivcim

In my eyes, in the wake of the disruptive digital forces shaping our present and near future, building and nurturing trust and influence with all your stakeholders is one of the key issues that any executive should put first on his or her strategic agenda. As clear evidence, take a look at the Edelman Trust Barometer, which has been substantiating this critical trend for almost 15 years. In many countries, and not only Western ones, there is sustained and growing distrust towards politicians and the media, but also towards companies and anyone else embodying the “establishment” or official “knowledge”. With lowered technological barriers and the ease of creating transnational relationships, brands and corporations now live in a networked society that totally reshuffles the cards at a blistering pace. Citizens can access far more information than they could in the past. As a direct consequence, awareness of ongoing issues rises, and things become trickier to hide or manipulate.

This assessment matters for corporate brands as well as product brands. People no longer take things for granted when a brand or a corporation speaks. People demand to be heard and involved in projects that affect their own lives, backyards or aspirations. People expect to be part of the solution. Take palm oil as an example. People are paying increasing attention to social and environmental topics. Consumer goods firm Unilever, acting on the demands of tens of thousands of consumers, has committed to purchasing all of its palm oil from sustainably produced sources by the end of this year (2015). And if you try to fiddle, the likelihood of being caught in the act is higher and higher, especially by activists and NGOs who are at the cutting edge of digital connectivity.

A few years ago, fostering trust was probably a bit simpler, as the main intermediary was the journalist. The latter has become suspect for many reasons; among them, narrative bias, sensationalist reporting and complacency with the mightiest (politicians, corporations, governments, etc.) are often fiercely criticized. This is why it is no longer enough to focus only on journalists, although media relations remain pivotal in any strategy. You must talk to, and actively listen to, NGOs, interest groups, regulators, employees and anybody else concerned by your activities. If you don’t, you put your reputation at risk and may trigger distrust of your activities.

The incredible Volkswagen fraud story provides a relevant case study. For several years, the company hammered home strong messages about “clean diesel” at both the corporate and brand levels: towards consumers, but also towards its own employees and the various stakeholders in the markets in which it operates. It eventually turned out that the company cheated on purpose, using software that reduced gas emissions on demand during approval tests. Despite the millions and millions spent on advertising and public relations, this shows that cosmetic communication is pointless. Even worse, it generates distrust at the end of the day. And today, the German car manufacturer has to contend not only with the courts, regulators and the media, but also with car dealers, car owners, NGOs and class-action groups who loudly express their concerns.

Nowadays, almost anybody can learn something and unveil it to the whole world through social networks, online petitions or even whistleblowing platforms as a call to action. From now on, the challenge is therefore to restore the company’s damaged trust and reputation: by acknowledging what needs to be said, by taking concrete actions to abide by the law, but also by proactively listening to the concerned stakeholders and meeting some of their requirements. It will take time and money, but there is no shortcut. Sacking the CEO was a good first decision for Volkswagen, but the controversy is far from over. Today, they are under close scrutiny from everyone concerned, and they will have to make the right decisions to rebuild a trustworthy reputation. And I would bet my two cents that similar stories will occur at other companies if distrust remains at these high levels. The winners will be those who inspire trust by maintaining a smart dialogue with their stakeholders.

Olivier Cimelière
CEO Heuristik Communications, a consultancy based in Paris
Author of “Le Blog du Communicant” (in French)

Disruption Via Big Data Analytics – Guest Post By Lutz Finger

Disruption is a big word. I see many startups that want to be ‘disruptive’, but we very rarely see disruption happen suddenly. Big changes do not happen overnight. A lot of industries are about to change because of data. The last 10 years have brought major advances in the way we store and process information, creating the potential for change by predicting the future. Data by itself is useless, but by using patterns, many companies aim to improve their business or even to create a disruptive idea. For disruption to happen, we need two main ingredients:

  1. Data is king. Yes, the internet has changed the retail industry, but more importantly it has created a competitive barrier to entry. The world will soon be divided between companies that have data and companies that do not. We will see private equity funds that focus only on buying data-heavy assets, as well as a revival of the utility companies as they discover their data assets. (Boring is the new sexy.)
  2. Mindshift – or a broken and ineffective system. Incumbents often have a great head start with data, but they do not use it – so others do. Why did Google buy Nest? To get data access into your house. Why didn’t the utility companies install smart devices in the home? They did not know that they could have this data, or that the data could be useful.

The media industry has changed! The retail industry has changed! The education sector is changing! Which industry will be next? Looking at where VC money is flowing: the healthcare sector. It fits the pattern: there is a lot of data and a broken system.

By Lutz Finger

@LutzFinger | lutzfinger.com

LUTZ FINGER is Data Scientist in Residence at Cornell University and author of the book “Ask Measure Learn”. He is an authority on data analytics and teaches a course on Data-Driven Thinking at Harvard Business School. As a director at LinkedIn, he oversees internal data products as well as LinkedIn’s Economic Graph Challenge.

LUTZ is a highly regarded technology executive and a popular public speaker on business analytics. As co-founder and former CEO of Fisheye Analytics, a media data-mining company, he supported governments and NGOs with data insights. Fisheye Analytics was acquired by the WPP group.

He serves as an advisor at several data-centric corporations in the United States and publishes a Forbes Column. He has an MBA from INSEAD, as well as an MS in quantum physics from TU Berlin (Germany).

The Disruption of Energy Storage – Guest post by Brandon Ng

While press coverage and overall hype around energy storage have increased notably – hardly an article on renewable energy is written without a mention of energy storage – energy storage is not an intrinsically new or novel concept. Pumped storage hydro (PSH) stations – where water is pumped up an incline into a large reservoir or body of water when the supply of energy exceeds demand, and released through turbines when demand for energy exceeds supply – have been around since the late nineteenth century.

Pumped Storage Hydro

The predominant issues with PSH stations are twofold:

  1. their economics only make sense at a grid-wide, utility scale; and
  2. to be cost competitive, PSH stations can only be developed where the necessary topographical features – landscape height and availability of water – already exist.

Energy Storage Systems

As such, the ‘revolution’ in energy storage today isn’t its invention so much as the widespread adoption of ‘micro’ energy storage systems (ESSs) without geographic limitations and/or by non-utilities: the industrial, commercial and residential sectors.

The most obvious and widely documented application of such ‘micro’ ESSs is in the continuous provision of energy generated by rooftop solar PV panels or small wind turbines. It could be said that both of these forms of renewable energy generation are novelties in the developed world, where grid electricity is, for the most part, reliable and affordable (relative to incomes). However, for many communities that lack access to grid electricity altogether, usually for geopolitical or economic reasons, solar and/or wind-based power generation is the only cost-effective means of electrification. It is in such communities that energy storage systems are both necessary and transformative in the provision of continuous electricity, one of the pillars of the modern first world.

In parts of the world where reliable, grid-based electricity is prevalent, whether the market for ESSs shifts from utility-scale ESSs for power providers to ‘micro’ ESSs in the industrial, commercial and residential sectors remains to be seen. If the landscape for ESS products does change, two major factors are likely to cause such a shift: government policy and the stance which utilities take on innovation, be it on business model or on technology.

Intra-day Pricing

Take the concept of variable intra-day pricing of electricity, for example: the notion that the value of electricity varies within a 24-hour day, reflecting the ever-changing relationship between the supply of and demand for electricity. Yet fixed-rate tariffs – often linked to prescriptive government policies – are still a pervasive (although not universal) feature across many major market segments. Where such price control mechanisms are implemented, the impetus to deploy energy generation or storage assets to match real-time supply against demand falls on the power providers.

If, on the other hand, the intra-day price of electricity is allowed to vary with its intrinsic value (as determined by market economics), then a case may exist for grid-connected ESSs deployed by the end users themselves. Such ESSs would allow users to purchase power during off-peak periods, when it is most economically favourable to do so, and consume it (or even sell it back to the utility operators) during peak periods, when energy is most valuable. This process is called ‘time-shift arbitrage’. At the risk of grossly oversimplifying a complex issue, the implementation of variable intra-day energy pricing in market segments where it is not already adopted is likely to be driven by government policies as a means of reducing energy consumption and the nation’s carbon footprint.
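The economics of time-shift arbitrage can be sketched with simple arithmetic: buy a full charge off-peak, lose some energy to round-trip inefficiency, and displace peak-priced consumption. A minimal sketch, in which the tariffs, battery capacity and efficiency are purely illustrative assumptions, not real market data:

```python
# Hypothetical time-shift arbitrage sketch. All figures below are
# illustrative assumptions, not quotes from any actual tariff.

OFF_PEAK_PRICE = 0.08   # $/kWh, overnight tariff (assumed)
PEAK_PRICE = 0.30       # $/kWh, evening tariff (assumed)
CAPACITY_KWH = 10.0     # usable battery capacity (assumed)
ROUND_TRIP_EFF = 0.90   # fraction of stored energy recovered

def daily_arbitrage_value(capacity_kwh, off_peak, peak, efficiency):
    """Value of buying one full charge off-peak and using it at peak."""
    cost_to_charge = capacity_kwh * off_peak
    value_at_peak = capacity_kwh * efficiency * peak
    return value_at_peak - cost_to_charge

daily = daily_arbitrage_value(CAPACITY_KWH, OFF_PEAK_PRICE,
                              PEAK_PRICE, ROUND_TRIP_EFF)
print(f"Daily arbitrage value: ${daily:.2f}")   # 10*0.9*0.30 - 10*0.08 = $1.90
```

Under these assumptions the spread between peak and off-peak prices must comfortably exceed the efficiency losses (and, over the battery’s service life, its capital cost) before the arbitrage case closes.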

Market changes

A host of technical and structural market changes are of course, required before this can be universally realised, two obvious ones being:

  1. an increase in the overall price of energy (either due to high commodity prices or the pricing in of the environmental impact of energy generation); and/or
  2. either a decrease in the cost or an increase in the finite service life of today’s ESSs.

Great strides have been made by policy makers and energy storage companies on both fronts, although the value proposition offered by ESSs as an enabler of time-shift arbitrage is still limited today. However, if current trends are any indication, this is the exciting reality we can reasonably expect in the near future: one where a broad, distributed network of micro ESSs is integrated into the fabric of modern, 21st-century households to ultimately provide cheaper, more reliable and more sustainable power.

 

Brandon Ng
Co-founder, CEO at Ampd Energy
(rebranded from QFE)

Brandon’s profile on Linkedin

The Disruption Of Genomics – Guest Post by Dr Brad Worrall and Stephen R Williams

A decade ago, sequencing the more than 3 billion bases of the human genome – the birth of the field of modern “genomics” – was a disruptive force in science and technology that changed the way we think about disease forever. The development of this technology has led to the sequencing of 100+ genomes, including not only Homo sapiens and close relatives such as the Neanderthal, but also E. coli, HIV, mouse, chimp, and a host of other organisms. Today, we are so efficient at generating these data that sequencing a human genome in a couple of days is the norm, and digital interpretations of these genomes have been performed tens of thousands of times over. While generating human genomic sequences, and to a lesser extent analyzing the digital readout of these genomes, has become commonplace, the impact of this information on society as a whole is just starting to be felt. The impact on society, and specifically on the field of healthcare, is a building wave that will soon take shape as a disruptive force in clinical education, day-to-day practice, and the financial infrastructure behind our healthcare system.

Individual sequencing

For an individual to have his or her genome sequenced today costs ~$1,000 USD, with the cost per “base” falling every day as the technology becomes more and more ubiquitous. This technology has the potential to benefit millions, if not billions, of people worldwide, but as the cost drops, individuals will undoubtedly come to the clinic at a higher and higher rate having had their genome sequenced before any healthcare worker recommended it. This will change the current paradigm from “have disease/syndrome, get sequenced” to “already sequenced – what can this tell me about my disease risk?” The question is: how will this disrupt the current way medicine is practiced and markets are branded?
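The per-base figure follows directly from the numbers above. A back-of-the-envelope sketch, using a round 3.2-billion-base genome size (exact counts vary by reference assembly):

```python
# Back-of-the-envelope cost per base at the ~$1,000 genome quoted above.
# The genome size is a round figure, not an exact base count.

GENOME_COST_USD = 1_000
BASES = 3_200_000_000          # ~3.2 billion bases in the human genome

cost_per_base = GENOME_COST_USD / BASES
print(f"≈ ${cost_per_base:.1e} per base")   # ≈ $3.1e-07, a fraction of a micro-dollar
```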

The academic conundrum

Currently, the number of hours that a typical medical school student spends training in genomics is limited, excluding those specializing as clinical geneticists. However, the field of genomics is expanding so rapidly that it is almost impossible for curricula to keep up, despite the impact that this information will have on day-to-day jobs in medicine. The need to step up the education of clinical partners in healthcare has become ever more important. These partners include clinically certified geneticists, genetic counselors, medical geneticists, and bioinformaticians. The latter is an extremely underfilled position that would benefit any and all healthcare institutions in the years and decades to come, as genomic information becomes more commonplace in the clinic. However, are the decision-makers, both public and private, willing to make a short-term investment to prepare for the wave of genomic information that will inevitably hit the clinic? And do we as a community have the motivation to retroactively train individuals who are already in practice?

The wider impact of genomics

In chronic disease treatment there is an established model: the ‘case manager’. Initially, a nurse reaches out to help a patient with a chronic disease. The patient is then assisted through the process of checkups, screenings, and treatment until the disease is gone or no longer manageable. The case manager leads this process. As genomics becomes more and more disruptive, would our healthcare system benefit from a genetic information manager? How will individuals process this information outside of contact with a genetic counselor (someone dedicated to this role), who is typically called upon by referral? Should this information be processed in a family-wise fashion? One’s genomic information affects everyone in one’s family tree, and these dynamics can be complicated and outside the training of today’s genetic and genomic specialists. Further, who pays for this “pre-disease” counseling, which could be both emotional and biological?

Dropping costs

As costs drop and individuals seek out this information on their own, marketing, branding, and awareness of drug-genome interactions become important. We already know that pharmacogenomic information is profoundly important. For example, azathioprine, given to individuals with a specific genetic variant, is associated with a lethal side effect; in fact, it is considered malpractice not to check enzyme function by genetic testing of TPMT. The HIV drug abacavir works well, except that it will kill some individuals carrying a specific genetic variant. In unselected patients, 5-8% develop a potentially deadly hypersensitivity reaction within the first 6 weeks of antiretroviral therapy. Prospectively screening individuals for the risk variant (HLA-B*5701 status) prior to starting therapy costs ~$17/person and avoids the far more expensive and deadly hypersensitivity reactions in more than 500 people for every 10,000 treated. Clinical practice and insurance coverage rapidly incorporated this step, and it has been hailed as a model for the adoption of a pharmacogenomic test.
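The abacavir figures above (~$17 per screen, more than 500 reactions avoided per 10,000 treated at a 5-8% reaction rate) can be turned into a rough cost-benefit sketch. The per-reaction treatment cost below is a purely illustrative assumption, not a figure from the text:

```python
# Rough cost-benefit sketch of HLA-B*5701 screening before abacavir,
# using the figures quoted above. REACTION_COST is an illustrative
# assumption; real treatment costs vary widely.

PATIENTS = 10_000
SCREEN_COST = 17          # $ per person, from the text
REACTION_RATE = 0.06      # 5-8% in unselected patients; a midpoint
REACTION_COST = 5_000     # $ to treat one hypersensitivity reaction (assumed)

screening_total = PATIENTS * SCREEN_COST
reactions_avoided = int(PATIENTS * REACTION_RATE)   # ~600; the text says >500
treatment_avoided = reactions_avoided * REACTION_COST

print(f"Screening 10,000 patients costs ${screening_total:,}")        # $170,000
print(f"Avoids ~{reactions_avoided} reactions worth ${treatment_avoided:,}")
```

Even under conservative assumptions, the screening cost is an order of magnitude below the avoided treatment cost, before counting the lives saved.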

Beneficial effects

These types of interactions will encourage pharma companies to develop a genetic test at the same time as the drug, where the standard of care may be to test for the genetic variant along with the prescription. Further, genetic and genomic studies can actually help companies market directly to consumers as individuals become more intimate with their genomes. This will allow marketing to be targeted by genotype and could potentially avoid life-threatening side effects and liability.

Financials of Genomics

This leads us to the financial aspect of genomics, which will certainly be a disruptive force in healthcare. Even though the costs of generating genomic information are at an all-time low, relatively few diseases can be specifically diagnosed from a genomic test, so insurance companies have not bought into covering the sequencing of each individual genome as good preventive medicine. Thus, the out-of-pocket cost for the patient remains high. This leaves us with another question: ‘What is the cost/benefit of having healthy people receive their genomic information, given our limited understanding of the diseases that cause the greatest healthcare burden worldwide (i.e. vascular disease, cancer, Alzheimer’s, diabetes)?’

The ethical issues

As with any technological innovation, the genomic revolution and the implementation of genomics in clinical care have raised a slew of ethical issues: genetic privacy (who can and cannot have access to genomic data – patients, families, spouses, employers), ownership (who controls what can and cannot be done with my genetic information – research, commercialization, public health), impact on family members (does the fact that my genetic information has relevance to my relatives give them any right to know or not to know), and, perhaps most acutely, the right not to know, or to change my mind about knowing. On a small scale, the story of the discovery of the gene implicated in Huntington disease is relevant. Before the genetic variant was known, three-quarters of at-risk individuals claimed a desire to know their genetic status, but once the test became available, fewer than one-quarter chose to get tested. The ability to decide whether or not to know may be more important to many than the actual knowing.

The legal / patent issue

Genomics provides ample opportunity for branding, which we already see. Companies, hospitals, and clinics position themselves at the vanguard, claiming cutting-edge practice and innovation. A quick perusal of the New York Times Magazine shows multiple healthcare systems touting their use of genomics to target cancer treatment, tailor therapy, and identify risk. On the other hand, broadly available genome-wide data has substantial implications for companies that patented genomic information (e.g. Myriad Genetics for BRCA1), a controversial practice. Both the United States Supreme Court in 2013 and the Australian High Court in 2015 ruled that naturally occurring DNA sequences are ineligible for patents. In the European Union, some genomic sequences can still be patented, but only under specific criteria. Nearly all Latin American countries have banned patents on genetic sequences; the situation in Asia is less clear. Broad availability of whole-genome data will undoubtedly challenge the tenability of at least diagnostic genomic patents on a practical level.

The future prospects…

What are the avenues that excite us? The science driving this disruptive force continues to evolve and change. We have gone from individual genetic tests, through genome-wide association studies, to exome sequencing (a cheaper but incomplete way to get genetic information) and whole-genome sequencing, and now a raft of other -omics (epigenomics, proteomics, metabolomics, and metagenomics) will be added to the mix, interacting with genetic information and creating an exponential growth of information. Under the current standard for clinical investigation, physicians and other practitioners tend to take a stepwise approach, going from focused, targeted testing to broader methods. Broad availability of whole-genome sequencing at an attainable price will upend this process and may in fact eliminate some intermediary technologies.

 

Bradford B. Worrall, MD, MSc
Harrison Distinguished Teaching Professor and
Vice-Chair for Clinical Research of Neurology
and Professor of Public Health Sciences
University of Virginia

Stephen R. Williams, PhD
Assistant Research Professor
Department of Neurology
University of Virginia

 

How Disruptive Is Climate Change? Guest post by Giles Gibbons

Let’s start with a definition of climate change: for me, it is about the reduction of resources used by a company in order to create a sustainable business. As such, I think that climate change is not a disruptive force, at least not as a standalone item; the disruption occurs only when it is linked to other forces. For instance, Uber is a tech disruptor first that happens to have a massive impact on climate change. Over the years, we have seen that attitudes towards climate change have been very slow to evolve. Recycling has grown in a fairly uniform way over the past 15-20 years. Yes, we see climate change entering into the psyche and into business plans, but not in a disruptive manner. If there is disruption, it comes because Silicon Valley companies have disrupted an industry and, as a by-product, there has been an impact on climate change. If climate change is about using ever fewer resources in a more astute and efficient manner, it is an ever-present challenge for businesses. We are not just referring to saving environmental resources; we are also looking at reducing water bills, electricity costs, etc. It’s an evolutionary and ongoing business challenge.

When we look at the Silicon Valley innovations that have made such waves in various industries, business models are being disrupted and, in many cases, climate change has received a major windfall. The point is that, while many of these initiatives are climate change positive, very few start out with climate change as the disrupting fuse [my term!]. Maybe Elon Musk’s Tesla is the best counter-example. For the most part, however, the corollary is that there is a lot more money floating around for investments in social impact.

On the CSR side, employees and clients are both aware. However, it is rarely critical to decision-making. CSR is a component, but an ever-shrinking one, in our experience. At Nike, for example, it is the CFO who has taken charge of CSR, where it is all about driving a sustainable business. The equation leads with resource and cost reduction; afterwards, such actions contribute to a positive image for the company. I like to say that CSR provides a broad hue from the consumers’ perspective.

To the extent that I am optimistic about human beings’ power to solve problems, I think that we will come up with solutions for the challenges of climate change and finite natural resources. For example, solar power is expected to become twice as powerful within the next 24 months. Within the same timeframe, energy storage is expected to become 10x more effective. These advancements will of course bring other challenges with them, but from a business perspective the need to reduce energy costs will remain just as pertinent over the long term.

Bottom line, we are already on a positive journey with regard to climate change. There will be iconic moments that may alter and shape the narrative, but the need to adapt to climate change is old news. Any potential disruption will occur first and foremost for businesses that are directly in energy. Secondly, there are complementary businesses — such as transportation — that rely heavily on the consumption of these energies.

Giles Gibbons, CEO and Founder of Good Business, author of “Good Business: Your World Needs You” (on Amazon)

The Technology Based Restaurant – Guest post by Giles Morgan

The low hum was faint at first, slowly getting louder the more you concentrated on the sound, until eventually the black object broke over the horizon, at first looking like a Blackhawk helicopter from a Hollywood blockbuster; then, as it moved in regimented fashion up and down the vines, its identity was revealed: a drone. Creating wine used to be about experience, gut instinct and soil, to name a few of the things guaranteeing a successful crop. Now it’s sensors, data, software and, of course, drones. Fitted with multispectral and visual sensors, these drones collect multitudes of data to determine the health of the vineyard (e.g. crop vigor), while on the ground other sensors monitor temperature and soil. At the bottling stage, an NFC (near-field communication) label is produced and placed on the bottle, to be read throughout the supply chain from producer to logistics, through customs, on to wholesale and finally to the restaurant.

The restaurateur checks her cloud-based online platform and sees a visual representation of her food and wine stock supply chain. Three days previously, she ordered a resupply of exotic ingredients for her head chef, and she can see where those supplies are in real time on her interactive map. A warning notification pops up on her SiteSage energy and asset management system, alerting her that the temperature in one of the freezers in the restaurant kitchen is rising and that she has three hours to move supplies to another unit. Given that restaurants use up to three times more energy than comparable commercial enterprises, the restaurateur is pleased to see that she has saved 10% on energy in the last quarter. Sensors in the fridges show her that there is enough spare capacity in the remaining fridges, so she arranges for the food to be moved.
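At its core, the freezer warning in this scene is a simple threshold rule over a stream of sensor readings. A minimal sketch, in which the -15°C threshold, the readings and the three-hour window are hypothetical rather than taken from any real SiteSage configuration:

```python
# Minimal sketch of the freezer-monitoring rule described above.
# The threshold, readings and 3-hour window are hypothetical.

SAFE_MAX_C = -15.0   # freezer should stay at or below this temperature (assumed)

def check_freezer(readings_c):
    """Return an alert string if the latest reading breaches the threshold."""
    latest = readings_c[-1]
    if latest > SAFE_MAX_C:
        return f"ALERT: freezer at {latest:.1f}°C; move stock within 3 hours"
    return "OK"

print(check_freezer([-18.2, -17.5, -16.1]))   # OK
print(check_freezer([-18.2, -14.0, -12.3]))   # ALERT: freezer at -12.3°C; ...
```

A production system would of course smooth the readings and debounce the alert rather than react to a single sample, but the decision logic is this simple comparison.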

The restaurant manager arrives and fires up his MICROS mTablet. He views analytical data on the previous day’s trading through the EPOS system. He then checks the real-time view of the restaurant bookings for the day and sees several tables available during the first cover; at the click of a button, he pushes a marketing campaign into the OpenTable mobile application to drive bookings. He analyses the evening reservations and wait lists, as well as the sales and inventory data, and uses the insights to automatically update his staff coming in for the lunchtime session. A notification is also autonomously pushed to a group of temporary waitresses via the restaurant’s mobile employee app, offering a shift this evening. The manager receives an almost instant reply from Sarah (a temporary waitress registered on the employee app) and the shift is confirmed back to her.

All the servers in the restaurant are equipped with mobile devices that can send orders directly to the kitchen. The craft beer and wine bottles (including the wine from the vineyard) arrive and are lined up in the bar. The restaurant uses SteadyServ iKeg to manage its array of beers. Sensors attached to each keg track the type and style of beer, when it was delivered to the restaurant, when it was opened and, of course, when it will run dry. Spirits in the restaurant use smart spouts from BarVision, which provide data insight from each pour. Everything is precisely monitored and integrated with the EPOS (electronic point of sale) system.

On arrival from the logistics courier, each box is scanned and the data is uploaded to the cloud. Instant personalised emails are sent to the relevant customers booked in for today, offering them the chance to pre-order their favourite beer or wine before arriving, and the drinks list is automatically produced for the bartender so that a seamless customer experience is created. A further email is automatically sent to each customer after their evening, offering them the opportunity to purchase a case of the wine or craft beer they enjoyed.

The doors open for lunch. Every day when the doors open, a bot fires a tweet to all the restaurant’s followers on Twitter with a link to today’s menu and a few special offers targeted at filling unsold tables over the next 7 days.
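The opening-time bot amounts to composing a message from the day’s menu link and offers, then handing it to whatever Twitter client the restaurant uses. A minimal sketch; the menu URL, offer text and the `post_tweet` placeholder are all hypothetical:

```python
# Sketch of the daily opening tweet described above. post_tweet() is a
# placeholder for the restaurant's actual Twitter client; the URL and
# offers below are made up for illustration.

import datetime

def compose_opening_tweet(menu_url, offers):
    """Build the opening-time message from today's menu link and offers."""
    today = datetime.date.today().strftime("%A")
    lines = [f"We're open! Today's ({today}) menu: {menu_url}"]
    lines += [f"- {offer}" for offer in offers]
    return "\n".join(lines)

tweet = compose_opening_tweet(
    "https://example.com/menu",
    ["2-for-1 starters before 6pm", "Book midweek, get a free dessert"],
)
print(tweet)
# post_tweet(tweet)  # placeholder: send via the restaurant's Twitter client
```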

Customers start to arrive and, as usual, people change their minds about which tables to sit at. The waitress checks her mobile tablet and, with the swipe of a finger, moves the customers to another table, automatically rearranging the tables that don’t already have customers.

A couple sit down and use their mobile phones and apps such as Secret DJ to request their favourite music during their meal. Requests are queued on the restaurant’s sound system and played in order, falling back on a playlist should there be a lack of requests.

Their drinks arrive immediately, as they were pre-ordered. The waitress hands the customers slimline mini tablets which act as their menus. With every order, the menu is automatically updated, removing customer disappointment if a particular item has sold out; in fact they won’t even know, as it is silently removed from the screen.

According to Gartner, 6.4 billion connected devices will be in use in 2016, with a staggering 5.5 million devices connecting every day, and this will reach 20.8 billion devices by 2020. The Internet of Things (IoT) will support total services spending of $235 billion in 2016. Finally, Gartner believes that by 2020 more than half of major new business processes and systems will incorporate some element of the Internet of Things.

Exciting times lie ahead with the use of IoT in many industries, that’s for sure. However, it is not without its dangers, especially around security: with so many devices connected comes the opportunity for hackers to use them to launch DDoS attacks, as we have already seen in 2016. It is also important to have a robust management platform to monitor and support these large-scale connected networks.

As a technologist and innovator, I’m excited by this revolution, if it also removes the disappointment of a corked bottle of wine or helps me discover new foods then even better. Chin! Chin!

 

Giles Morgan,
Global Digital Leader |
Global TAS at EY ‘Misfit & Innovator’ M&A

​Diversity & Inclusion – Guest post by Michael Stuber

What should be seen as a business case and common sense turns out to be a long-lasting challenge for people and organisations

While differences have always existed in societies, and certainly in business organisations, the phenomenon of diversity has become a disruptive force over the past 25 years. The end of the East-West divide, in combination with the emergence of the Internet, initiated not only the Third Industrial Revolution but also a fundamental paradigm shift in the way many people live and work (together), at least in the Western world. Changes include an unprecedented growth in individuality (and hence diversity), a strong preference for multicultural environments (including the workplace), and multiple new ways of collaborating and communicating. As a result, all levels of human cognition have been affected, which presents huge opportunities for the business world, but also challenges.

Reaping the disruptive value of Diversity

In order to realise benefits from diversity, the value chain of Diversity & Inclusion needs to be managed carefully and ideally in a systematic way: differences can only be turned into competitive advantage when openness prevails – individually and in the organisational culture – and when inclusive processes, behaviour and communication are applied. The benefits of getting this value-creation process right have been proven by 205 robust studies presented in the International Business Case Report. Some studies highlight that in order for diversity to add value, healthy conflict, e.g. through minority dissent, is required. This hint at existing challenges is only the tip of an iceberg, nowadays discussed under the headline of Unconscious Biases.

Hindering the productive disruption of Diversity

While the term ‘Unconscious Bias’ most often describes specific types of implicit associations, my analysis of existing research from the past decades suggests that it serves perfectly to describe six types of biases in three areas that have one thing in common: Making it hard for individuals, teams and organisations to tap into the potential of Diversity by consistently practicing Inclusion. The main categories of Unconscious Biases that are of immediate relevance to Diversity Management include personal / human preference for sameness, stereotypes about ‘others’, biased application of (theoretically) meritocratic processes, micro-inequities, unwritten rules in mono-cultures and the organisational preference that reproduces success types of the past. The dynamics can be observed on individual, process and organisational levels, and some biases stabilise each other in a way that makes mitigation a complex task.

Making Diversity & Inclusion work is complex

Over the past twenty years, a number of success formats dominated each of the different eras, each claiming to be the silver bullet everyone was looking for. In fact, the critical questions representing resistance against diversity, inclusion or both have not changed much over the past decades. What’s in it for me? For the business? Why change in the first place? Is there any urgency at all? These and other common questions show quite clearly that a complex change strategy must be designed in order to nudge people and organisations towards overcoming initial and subsequent barriers and gradually unleashing the power of differences. A combination of different change models has proven advisable: the generic threefold model of leadership, tools and cultural change serves as a backdrop against which more D&I-specific approaches can be designed. The different types of Unconscious Biases provide another template for developing roadmaps. Multi-phase models for organisation development, such as Kotter’s 8 steps, make timing more effective. Finally, the value-creation model of D&I provides quality checkpoints to know whether your strategy will eventually lead to the desired benefits. One more element adds to the complexity: stakeholder management continues to be a challenge in many or most D&I processes, because perceptions, personal convictions, needs and possibilities vary a lot between different target groups and between individuals within those groups.

Michael Stuber,

Founder and Owner-Manager of European Diversity
VP of International Affairs, European Institute for Managing Diversity

 

The Power Of The Internet Of Things – Guest post by Jim Hunter (@theiotguru)

The term Internet of Things (or IoT) was coined in 1999. Just what does this super-generic moniker mean? Literally, it means that physical devices are beginning to connect to the Internet. If you think about that for a few seconds, that is technically what the Internet already is: physical devices connected together through a common network. The physical devices or “things” of the Internet have been servers, routers, switches and all forms of connected compute devices. From that perspective, IoT is a redefinition of what an Internet thing is; more specifically, a redefinition of what a connected compute device is. The vast majority of these new IoT devices will be sensors: sensors that read, measure, collect and digitize the world around us.

Creating context

The reason sensors are so important is that they provide context. Today the most important context-creating devices are those we carry with us: our mobile devices and, in some cases, wearable devices loaded with sensors local to us.

Soon the most important devices will be those around us. In the near future, thousands of sensors will be fixtures in our environment, emitting contextual data messages. These sensors will broadcast contextual identifiers that answer the questions of who, where, what, when and why to applications that are personal to you. Sensors will measure and broadcast information about position, health, energy, radio strength, climate, traffic, vibration, stress, noise, light… basically anything that can be measured and has value to mankind will be measured. That measured information will be broadcast over a short distance to mobile devices within range. For any given moment in time, a detailed digital picture of you and your surroundings can be captured for your private applications to consider.
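To make the broadcast model concrete, here is a minimal Python sketch. The sensor names, message fields and values are hypothetical illustrations, not part of any real IoT protocol: a fixed sensor emits a small contextual message, and a listening personal device folds nearby broadcasts into a snapshot of its surroundings.

```python
from dataclasses import dataclass
import json
import time

@dataclass
class ContextMessage:
    """A small contextual message a fixed sensor might broadcast."""
    sensor_id: str     # who is broadcasting
    location: tuple    # where: (lat, lon)
    reading_type: str  # what is being measured
    value: float       # the measurement itself
    timestamp: float   # when it was taken

    def to_broadcast(self) -> bytes:
        """Serialize for a short-range broadcast (e.g. a BLE advertisement)."""
        payload = {
            "id": self.sensor_id,
            "loc": self.location,
            "type": self.reading_type,
            "val": self.value,
            "ts": self.timestamp,
        }
        return json.dumps(payload).encode()

def build_context(messages):
    """A listening device folds nearby broadcasts into one snapshot."""
    snapshot = {}
    for raw in messages:
        msg = json.loads(raw.decode())
        snapshot[msg["type"]] = msg["val"]
    return snapshot

# Two hypothetical fixtures within radio range of a passing phone:
noise = ContextMessage("lamp-post-17", (37.78, -122.41), "noise_db", 62.0, time.time())
temp = ContextMessage("bus-stop-4", (37.78, -122.41), "temp_c", 18.5, time.time())

context = build_context([noise.to_broadcast(), temp.to_broadcast()])
print(context)  # {'noise_db': 62.0, 'temp_c': 18.5}
```

The key design point, expanded on below, is that the fixtures broadcast and the personal device listens; the snapshot is assembled privately on your own hardware.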

IoT privacy issues

The personal and private nature of your data and identity requires this design, as opposed to your device broadcasting its presence to other devices. It also means that creating informational infrastructure that works hand in hand with applications on personal devices is the future of the IoT, and a massive opportunity. This is not new: it is essentially how GPS location works. The GPS satellites broadcast small information messages that include their identity and the time; it is up to the personalized location devices to make those messages usable for a consumer. The same design is appearing in shopping situations, where stores broadcast location-specific RFIDs (Radio Frequency IDs) to tell listening apps where products are in a store, and the application then converts those messages into product locations and related purchase deals for the user.

It is important to understand that there are a variety of ways to maintain security and privacy, even for a broadcasting RFID. For example, a given ID could actually be an abstracted hash. To make sense of an abstracted hash, the application would have to pass it to a decoder, which may live in a cloud or fog service. The decoder may require authentication before resolving the hash to the actual information pertaining to the RFID. This authenticated lookup also allows a given RFID to decode into different information depending on the authentication level of a given user. For example, a teenager may get different information from an RFID than a parent or head of household would, using the same application. Most importantly, a given RFID may return completely different information when decoded by different services. Consider RFID 0088776655AB: it may decode to a fire hydrant on an emergency firefighter’s heads-up display (HUD), to a no-parking zone in a traffic application, or to an obstruction warning for a person who is visually impaired. Different apps can process the same IoT surroundings differently, which will enable a massive new wave of value-add applications.
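That multi-decoder lookup can be sketched as a simple table-driven function. This is only an illustration of the idea: the service names, the second ID and the auth levels are hypothetical, and only RFID 0088776655AB comes from the example above.

```python
# Each decoder service maps the same opaque ID to its own meaning.
# Service names, auth levels and the second ID are hypothetical.
DECODERS = {
    "fire_service": {
        "0088776655AB": {"any": "fire hydrant (HUD overlay)"},
    },
    "traffic_service": {
        "0088776655AB": {"any": "no-parking zone"},
    },
    "retail_service": {
        "AA1122334455": {
            "adult": "wine aisle, 20% off promotion",
            "teen": "beverage aisle",  # same ID, reduced detail
        },
    },
}

def decode(service: str, rfid: str, auth_level: str = "any") -> str:
    """Resolve an abstracted hash via a given decoder service.

    The same ID yields different information depending on which
    service is consulted and the caller's authentication level.
    """
    entries = DECODERS.get(service, {}).get(rfid, {})
    return entries.get(auth_level) or entries.get("any") or "unknown ID"

print(decode("fire_service", "0088776655AB"))            # fire hydrant (HUD overlay)
print(decode("traffic_service", "0088776655AB"))         # no-parking zone
print(decode("retail_service", "AA1122334455", "teen"))  # beverage aisle
```

In a real deployment the table would live behind an authenticated cloud or fog service rather than on the device, so the broadcast ID alone reveals nothing to an unauthenticated listener.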

The above example is just one of many value propositions of IoT. With IoT, we are giving a voice to an unprecedented number of things that can measure every aspect of our world. These things will provide context like we have never known before. They will answer the questions of who, where, what, when and why.

jim hunter futureproof IOTJim Hunter,
Chief Scientist & Technology Evangelist at Greenwave Systems Inc.,
@theiotguru on Twitter

 

 

The Burgeoning Landscape of Localization Tools and Smartphones – Guest post by Anne Bezançon

For the first time in human history, every movement in the physical world can be identified, recorded and analyzed through our mobile phones, smartwatches, fitness trackers and soon smart-tattoos and implants. This fundamentally changes our relationship to ourselves and others because “digital” and “physical” are now merging, providing a completely new source of data and analysis, and ultimately a higher-fidelity representation of who we are.

The expected impact of localization tools

The impact of these capabilities goes way beyond their initial application to maps and navigation or even geofence marketing. We are exploring use cases across all verticals: advertising (from attribution of campaigns to segmentation of audiences based on their behavior in the physical world), retail (consumer patterns, routes, frequency, times), healthcare (from tracking an infectious disease across the country to reminding folks of the proximity of a pharmacy to pick up their prescription), sports and entertainment (from performance tracking apps to geofencing communications within a venue for a few hours), news (local citizen journalism), financial services (fraud prevention by matching transactions to places), transportation and logistics/delivery (from Uber to FedEx, route optimization, user feedback in real time), field service operations (team management for utilities), and, of course, public safety and government (managing refugee flows is a timely concern). From collecting anonymous data from millions of users and extracting statistical models of behavior to addressing one individual’s specific needs based on interactive systems, localization tools are going to change every sector of activity.

From a company’s perspective, what needs to be done to take advantage of it?

First, think about what you could know through these tools that you don’t today. Localization tools provide data that was not available before.

Second, research and identify the right partner who has the expertise and technology to help you.

Third, start doing something, iterate and learn. There are major opportunities for competitive edge in the right combination of the first two steps above.

What are some of the risks and opportunities?

The biggest risk lies in the need to define new boundaries and protection mechanisms for privacy, as regulators are still lagging behind the very rapid evolution of technology, particularly in mobile, the Internet of Things, and related fields.

The opportunities are many from a business standpoint, since more “picks and shovels” need to be built: tools specialized in certain vertical problems and solutions, in back-end computation of increasingly large amounts of data, or in front-end data visualization for both consumers and business decision-makers.

——-

Anne Bezançon is the founder and President of Placecast, the leading enterprise platform for monetizing mobile location and user data at scale. The company specializes in providing proven, secure, privacy-first solutions for big data monetization to the largest Telecom (AT&T, Rogers, Telefonica), Financial Institutions and Media companies in the world. Over 500 brands have used the Placecast platform, including Starbucks, Subway, HP, JetBlue, McDonald’s, and Pizza Hut.

A native of France, Anne discovered her passion for technology when she helped develop the Minitel, a precursor to the Internet. Anne moved to Silicon Valley in 1996. She has since started three companies and participated in the launch of two more. In 1995, she organized the NGO Forum of the United Nations Conference on Women in Beijing, and pioneered private sponsorships from Apple and HP to enable training in word processing and email for 40,000 participants.

Anne was invited to meet with the French President during his official visit to San Francisco in March 2014. Anne was also named to the 2013 “Mobile Women to Watch” list from Mobile Marketer. In 2011, Anne attended the eG8 Summit, an invitation-only summit of leaders in government and industry focusing on the Internet in the context of global public policy. She writes thought leadership pieces for leading tech and business publications, including Forbes. Anne holds a diploma from Sciences-Po Paris, and an LLM in Business Law. She is the author of several patents in the field of location-based technology, and speaks frequently at various tech industry and business events.

The Car Of Tomorrow Inches Closer – Guest Post by Arthur Goldstuck

If one is looking for a barometer of the evolution of the motor vehicle, one of the best places to find it is in the rising pressure on car makers to display automotive technology at consumer tech shows.

While the likes of the Detroit, Geneva and Tokyo Motor Shows still dominate launches, unveilings and announcements, the technology breakthroughs are slowly moving across to the likes of the Consumer Electronics Show (CES) in Las Vegas and the Mobile World Congress (MWC) in Barcelona.

Marque by marque, the car makers are trying to find a place among the gadgets and smart devices. Twelve major manufacturers now make their way to CES as a matter of course. And one manufacturer did the unthinkable in 2016: launched a new vehicle at MWC.

The biggest news announced at CES 2016 was not a technology but a piece of paper. To be precise, a driving licence. But this was no ordinary licence.

The US state of Nevada, home of CES, awarded the world’s first test licence for autonomous driving of a standard production car. The really big news was that the beneficiary was not a futuristic concept car of the kind launched by Nissan at the Tokyo Motor Show in October or by Mercedes-Benz at last year’s CES.

Instead, it was the new Mercedes E-Class, with three standard production 2017 models given approval to drive themselves by the Nevada Department of Motor Vehicles (NDMV).

It is difficult to overstate the significance of this news. It means that the autonomous vehicle is no longer an experimental toy built by Google and operated by geeks. It means that vehicle software creators like Microsoft and BlackBerry no longer have to persuade manufacturers that this is their future. It means the likes of Ford, Audi and Volvo will also move on from elaborate adaptation of existing vehicles, modified steering and retro-fitting sensors.

As a Daimler statement put it, “The standard-production vehicle is already extensively equipped with intelligent technology. This means that, for testing purposes, it is necessary merely to make some smaller software modifications to the DRIVE PILOT control unit.”

But the robots have not taken over. Yet. For now, the self-driving cars will still need a trained test driver. While the tests are allowed on all highways in Nevada, human drivers have to take over for turning, merging and departing. The NDMV rules also require a trained driver behind the wheel and a second person in the vehicle on test drives.

Meanwhile, the unveiling of a new vehicle at the MWC 2016 represented a seismic event in the history of both vehicle manufacture and consumer technology.

At the world’s largest expo devoted to mobile technology, the unveiling of the new Ford Kuga SUV marked the final crossing over of car technology into the preserve of consumer devices. While manufacturers have been showing off in-vehicle advances and self-driving possibilities at CES for around seven years, the event always played second fiddle in this arena to the Detroit Motor Show. Detroit would see the unveiling of the latest cars and tech, while Las Vegas would act as a showcase. Around six weeks later, MWC in Barcelona would find itself coughing in the exhaust smoke of the media bandwagon that had been and gone.

No more. The presence of the new Kuga at MWC was almost as significant as the keynote address by Mark Fields, CEO and president of Ford Motor Company, who declared the manufacturer’s repositioning as both an automotive and mobile business.

This repositioning is at the heart of the choice of tech show rather than a car show to launch a new vehicle. However, the Kuga also offers the most convincing evidence yet of the mainstream potential of the connected car.

It debuts the latest version of Ford’s on-board information and entertainment system, SYNC 3, which uses conversational voice commands to control audio, navigation, and climate functions. It also integrates seamlessly with most smartphones by supporting Apple CarPlay for iPhones and Android Auto for Android devices. It means that supported apps on the handsets can appear on the vehicle’s 8-inch touchscreen display and be controlled from there.

This represents a dramatic breakthrough, particularly in the world of in-vehicle navigation, which traditionally confined drivers to the mapping systems that came with the cars – and were thus already obsolete when they rolled off the production lines. SYNC 3 allows the latest version of Google Maps, Apple Maps or HERE Maps, for example, to be called up from the phone.

The significance of this is that vehicles can, for the first time, take full advantage of the rapid evolution of mobile technology, apps and utilities. In the next few years, the technology will be rolled out to all new Ford vehicles, meaning that cars aimed at the mass market will enjoy the same information system advances as high-end vehicles.

The Kuga introduces new driver assistance technologies as well, serving as a precursor to autonomous or self-driving vehicles. The existing semi-autonomous Active Park Assist technology is joined by Perpendicular Parking functionality, which uses ultrasonic sensors to locate parking spaces and steer the vehicle into them. The driver still controls the accelerator and brake, but the hard work is taken over by the car.

Coming out of parking spaces also becomes safer through additional sensors. Cross Traffic Alert uses radar with a 40-metre range to warn drivers of vehicles approaching from either side.

The sensors, which are for now the key to making vehicles safer, will be at the heart of the future self-driving car. That means that the Kuga is not only a taste of the future, but also a proof-of-concept that will hasten the arrival of tomorrow’s car.

When it does arrive, as we are seeing in Nevada, it will also signal a new era of licensing authorities having to reeducate themselves on the capabilities and limitations of vehicles. But don’t expect that to happen overnight.

 

Arthur Goldstuck is founder of World Wide Worx and editor-in-chief of Gadget.co.za. Follow him on Twitter at @art2gee, and subscribe to his YouTube channel.