Tuesday, September 26, 2017

The 6 As of the Uber Apology

As Uber turns the page with a new CEO — and with their high visibility and importance in the growing gig economy — I take a quick stab at an analysis of their recent apology regarding the denial of a transport license in London. I use an apology model I conceived a couple of years ago:

Acknowledging something has happened. “We’ve got things wrong” doesn’t tell us what. We need to know that they know and understand what went wrong. That said, it’s not always a good idea to highlight all the negatives. Score: 8.5/10

Authentic expression of regret. The language in the letter seems sincere and “We won’t be perfect” sounds like an honest admission. But it’s hard to know if Uber CEO Dara Khosrowshahi is truly remorseful if we don’t understand the transgression(s). Plus, this is Uber’s first crack at diplomacy after a history of confrontation. They need more of a track record to gain additional trust. Score: 8/10

Appropriate tone and language. I don’t think saying “We will appeal this [negative] decision” helps here — it muddies the message and pulls us back toward the more familiar, pugnacious Uber. The rest of the letter, though, speaks reasonably in plain words. Score: 7.5/10

Acceptable venue. An open letter in this situation is fine but should be backed up by personal, private outreach. Score: 9/10

Acting in the right timeframe. This is moderately quick: Transport for London refused to renew the license on Friday, Sep. 22. Score: 9/10

Announcing next steps. “We will listen to you” and talk of writing the “next chapter” give us little information. There are references to advances in wheelchair accessibility and clean air, but saying “we will work with London to make things right” doesn’t tell us what or when other issues will be addressed. Score: 7.5/10

The 6 As rubric weights the elements differently. So, my overall score (and yours may certainly differ) works out to 82/100. Not great, but it seems it was good enough for London Mayor Sadiq Khan to ask for the parties to come back to the table for new talks.
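For the arithmetic, here is a minimal sketch of how a weighted 6 As score could be computed. The weights below are hypothetical placeholders, since the post doesn’t disclose the rubric’s actual weighting; with equal weights the six scores above would average out to 82.5/100, and the illustrative weights here land at roughly 82/100.

```python
# Minimal sketch of a weighted "6 As" score.
# The weights are illustrative assumptions, not the rubric's actual weighting.

scores = {
    "Acknowledging": 8.5,
    "Authentic regret": 8.0,
    "Appropriate tone": 7.5,
    "Acceptable venue": 9.0,
    "Acting in timeframe": 9.0,
    "Announcing next steps": 7.5,
}

weights = {  # assumed weights; must sum to 1.0
    "Acknowledging": 0.20,
    "Authentic regret": 0.20,
    "Appropriate tone": 0.15,
    "Acceptable venue": 0.10,
    "Acting in timeframe": 0.15,
    "Announcing next steps": 0.20,
}

overall = sum(scores[k] * weights[k] for k in scores) * 10  # 0-10 scale -> 0-100
print(f"Overall score: {overall:.0f}/100")  # ~82 with these assumed weights
```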

Or, was it pressure from Uber’s petition? The apology, after all, starts with the line: “We want to thank everyone who uses Uber for your support over the last few days.” The petition has over 790K signatories (aka potential voters), as of this writing. Their apparent two-pronged strategy reminds me of Muhammad Ali’s classic line: “Float like a butterfly, sting like a bee.”

This article also appears on Medium.

Thursday, September 7, 2017

Who Can Win the Battle for Truth?

This article also appears on Medium.

We’re fighting over the truth in the news media, at home, at work, and in the halls of Congress. And the battle carries over to our institutions of higher education with sides taken over free speech and academic freedom.

President John F. Kennedy said, “The goal of education is the advancement of knowledge and the dissemination of truth.” So, it seems appropriate that Cornell University recently held a symposium entitled “Universities and the Search for Truth.”1

Why does this all seem more urgent today? Humans have always been truth-challenged. Ancient conquerors frequently rewrote history. The Bible is filled with stories of deception. Some countries, institutions and industries exist on the clever use of propaganda.

The truth is that the amount of what we call information is expanding wildly, and it is spread in more ways and with greater consequences than ever before. The rise of the internet and consumer-generated content, pressures on professional journalism, and our reliance on social media channels and their complex algorithms all influence what we see, hear, believe and share.

Add these modern issues to Friedrich Nietzsche’s declaration that “There are no facts, only interpretations” and we have a gray, goopy and potentially grave mess. Echoing the reality of truth’s plasticity, Professor David Shalloway at the symposium said, “Data can be true or false, but knowledge is usually only an approximation.” And Professor Holly Prigerson voiced a similar view: “Truth is not an absolute thing. It’s not binary, and it’s on a continuum.”

Our judicial system recognizes our inclination to manipulate the facts into a self-satisfying truth when it asks us to “tell the truth, the whole truth and nothing but the truth.” As much as the words matter, though, Professor Sarah Murray said, “Language itself doesn’t ensure the truth or reliability of information. It’s how we use language and communication and who’s using the language that are judgments about that.” Most people understand this: the messenger can increase or decrease the credibility of the message.

While the literature carries many comparisons between strategic communication and war – offense and defense, knowing your opponent, hearts and minds, etc. – Professor Mor Naaman acknowledged his talk was particularly “dark” and “grim.” “Modern media technology is killing truth and knowledge,” he said. “Instead, our technology emphasizes only information and emotion.” He added that social media is a “well tuned and optimized machine that plays exactly” to our biologically, psychologically and evolutionarily wired sense of emotion, not truth or knowledge.

You can see how fear and anger are being used as platforms for persuasion, but we can use this insight about emotion toward a more favorable purpose. The most effective, enduring way to communicate is to link fact and emotion through the use of examples, imagery and storytelling. And the language needs to be relevant; context is required. A famous wrongful death case involving drug side effects was lost well before the conclusion of all the testimony. “We didn’t know what the heck they were talking about,” a juror told The Wall Street Journal.2

Yet, a problem remains in how we receive information. We hear about algorithms making viewing choices for us – the creation of echo chambers. The algorithms are sometimes called filters but they are not. They curate but also isolate. They homogenize, not cross-fertilize.

The symposium panel offered some fixes: educate students on the ethical, philosophical and social issues of technology; study how technology can create misinformation and biases; create new curricula; and focus new research on these issues. While important, these are long-term solutions, and it’s unclear how the findings would be applied widely.

We need equal attention on smaller, shorter-term initiatives. So, let’s stipulate that the truth is subjective and focus instead on the starter material – the objective facts – since these are frequently denied or called into doubt. In addition to the earlier call for using relevant language in describing the facts and connecting these to resonant emotions, we should consider the following:

Push to end false equivalency and the conflation of opinion with fact. If 97 percent of climate scientists agree on human causes for climate change, we should not see one-on-one debates that suggest the sides are evenly matched. News organizations and social media news feeds should present the available, accurate data but must differentiate between fact and opinion.

Overwhelm the bad with the good. More experts need to speak out and share the facts to help push inaccurate information down the internet search list. The Alan Alda Center for Communicating Science at Stony Brook University is one of a growing number of programs tackling how to communicate complex information in more understandable, relevant ways.

Get there first. False or misleading statements are terribly difficult to retract and, harder still, to erase from one’s memory. In a study of nearly 900 participants, researchers showed “the repetition of tentative news stories, even if they are subsequently disconfirmed, can assist in the creation of false memories in a substantial proportion of people.”3 The bottom line is that people may continue to rely on misinformation even when a subsequent retraction is made and remembered.

Use technology to advance real-time fact-checking. We can’t rely on a reporter’s memory or ability to interrupt a guest to check the facts. The idea for a “Truth Meter” was raised at the symposium, and it was reported last week that two Penn State professors received a grant from the National Science Foundation to develop technology to identify and exclude “fake news” on digital platforms.4 If IBM's Watson computer can win at Jeopardy!, there's no reason that (nearly) real-time fact-checking couldn't be a reality. We should explore the potential for machines to sift through transcripts, proceedings and testimony; almanacs and atlases; laws, regulations and policy statements; credible survey data; and peer-reviewed research reports.
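To make the sifting idea concrete, here is a toy sketch of matching a claim against a small reference corpus. It is purely illustrative: the corpus, the similarity measure and the threshold are all assumptions, and a real system (the symposium’s proposed “Truth Meter,” the Penn State detector, or a Watson-class engine) would rely on large-scale retrieval over vetted sources plus human review, not simple string similarity.

```python
from difflib import SequenceMatcher

# Toy stand-in for the vetted sources named above: transcripts, almanacs,
# laws and regulations, survey data, peer-reviewed reports. Illustrative only.
reference_facts = [
    "Transport for London declined to renew Uber's operating license in September 2017.",
    "About 97 percent of publishing climate scientists attribute recent warming to human activity.",
]

def check_claim(claim, corpus, threshold=0.5):
    """Return the closest reference statement and its similarity score.

    If nothing clears the (assumed) threshold, the claim is flagged as
    unsupported rather than false; a human reviewer would take it from there.
    """
    best_fact, best_score = None, 0.0
    for fact in corpus:
        score = SequenceMatcher(None, claim.lower(), fact.lower()).ratio()
        if score > best_score:
            best_fact, best_score = fact, score
    if best_score >= threshold:
        return best_fact, best_score
    return None, best_score

match, score = check_claim(
    "Uber's London operating license was not renewed in September 2017.",
    reference_facts,
)
print(match, round(score, 2))
```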

These efforts will be successful only if our institutions – and society at large – do more to promote and enforce honesty, and venerate intellectual exploration. I keep reading about our “hyper-connected world.” But in these connections we need hyper-vigilance for the facts. Perhaps then we’ll have an easier time searching for truth.


I invite you to follow me on Twitter @pauloestreicher.


1. Cornell University. “Academic Symposium: Universities and the Search for Truth.” August 24, 2017. https://www.cornell.edu/video/academic-symposium-universities-search-for-truth.
2. Tesoriero, H.W., et al. “Merck Loss Jolts Drug Giant, Industry.” The Wall Street Journal, August 22, 2005.
3. Lewandowsky, S., et al. “Memory for Fact, Fiction, and Misinformation.” Psychological Science, 2005, 16(3):190-195.
4. Associated Press. “Penn State Professors Get Grant for ‘Fake News’ Detector.” August 31, 2017. https://apnews.com/e1b353e9e62f4e4096e419c13b061750/Professors-get-$300,000-grant-for-digital-fake-news-detector.