Microsoft's Bing is irresponsible with suicide


November is mental health awareness month for many, the month in which men around the world grow out their moustaches to draw attention to the troubling rate of male suicide. From a tech angle, there is an aspect of this that is being overlooked: the culpability of now-ubiquitous search engines like Google and Bing, and whether their parent companies could be doing more.

To give credit where credit is due, Microsoft has done some work here. If one searches “How to kill – ”, the autocomplete stops there. Upon searching “How to kill yourself”, Bing surfaces the hotline number for the Samaritans in the UK, urging users to call and get help. If you search “suicide” or “how to commit suicide”, Bing fills the page with resources aimed at getting the searcher to seek help. All of this is good and should rightfully be applauded.

Despite all that, however, Bing isn’t doing enough in this area, and it often provides dangerously “helpful” results, especially when compared to its main competitor, Google.

To illustrate: a Bing search for “How to kill yourself” returns a first page on which three of the top five results either urge the searcher to go through with it or provide a guide to methods, while the other two are message boards. Clicking through confirms that these results are indeed what the searcher might want, though not in the way Microsoft presumably intends, and certainly not what they need. The equivalent Google search, by way of contrast, returns an NHS web page as its very first result, a self-help link second, and then a questionable few results followed by help resources, all on the first page.

More worrying than the results themselves, Bing also offers alternative searches, just in case your first few results weren’t helpful enough. It seems to ask the user: is “How to kill yourself” not giving you the results you wanted? How about “how to poison yourself”, “painless ways of suicide” or “best way to hang yourself”? Google, once more, offers no alternative suggestions.

To illustrate how this undermines Microsoft’s earlier efforts: if you search “how to commit suicide”, Bing helpfully fills the page with resources, yet at the top right of the screen it offers alternative searches such as “best tablets for suicide” and “pills to take for suicide”. And if one makes a typo or slight misspelling, Bing oddly offers none of the helpful resources, even while recognising that the search term is most likely misspelled, whereas Google still does. There are more examples I could give, but the point should be clear: Microsoft is consistently failing to put in the extra work necessary to protect vulnerable users of its Bing search engine.
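To make the misspelling point concrete, here is a minimal sketch of how a search engine could fuzzily match typo-laden queries against crisis-related terms so that the help panel still triggers. Everything in it is hypothetical: the term list, threshold, and function name are invented for illustration, and a real engine would use a far larger curated list and a proper classifier rather than simple string similarity.

```python
import difflib

# Hypothetical list of query terms that should trigger a crisis-help panel.
# A real search engine would use a far larger, curated, multilingual list.
CRISIS_TERMS = [
    "suicide",
    "how to commit suicide",
    "how to kill yourself",
    "self harm",
]

def should_show_help_panel(query: str, threshold: float = 0.8) -> bool:
    """Return True if the query fuzzily matches a crisis-related term.

    Uses difflib's similarity ratio so that common typos and misspellings
    (e.g. "how to comit suicide") still trigger the help resources, which
    is the behaviour observed in Google but not in Bing.
    """
    normalised = query.lower().strip()
    for term in CRISIS_TERMS:
        similarity = difflib.SequenceMatcher(None, normalised, term).ratio()
        if similarity >= threshold:
            return True
    return False

# Misspelled queries still match, so the hotline panel is still shown.
print(should_show_help_panel("how to comit suicide"))  # True
print(should_show_help_panel("weather tomorrow"))      # False
```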

Search engines are powerful in this day and age, and they can be, and have been, manipulated to push certain results over others. Microsoft, Google, Apple and the rest all know and understand this power, which is why they have instituted safeguards on certain categories of results, especially those relating to suicide and self-harm.

The Samaritans, a suicide prevention charity most prominent in the UK, shared some thoughts on suicide and the online world back in 2013:

We believe that search engine providers have a corporate social responsibility to ensure that credible sources of support are promoted when people enter suicide-related terms into search engines.

There is also more that could be done to reduce unhelpful ‘auto complete’ functions on partially-entered search terms.

We have good relationships with some search engine providers but there is more that they, and we, can do on this. There are also some who have not really stepped up to the plate on this issue.

While Microsoft has, as noted above, added a hotline to suicide-related searches, its provision of alternative search terms could reasonably be construed as equivalent to the unhelpful autocomplete suggestions the charity calls out above.

There is a discussion to be had about censorship here, but that objection is surely nonsense. A commitment to free speech is not, especially in this case, a commitment to a suicide pact. Search engines can, of course, still surface the offending results, but promoting them on the first page is profoundly unhelpful. Microsoft shouldn’t be promoting methods of suicide, or suggesting alternative searches to find “better” ones. Its algorithms can be tweaked to promote safer results, and the firm should think harder about how its results actually affect people.
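As a rough illustration of the kind of tweak meant here, the sketch below pins vetted help resources to the top, demotes rather than removes flagged results, and suppresses the related-searches module for crisis queries. Every name in it is hypothetical, including the harm flag, which stands in for the output of some upstream classifier; a production ranking pipeline is vastly more complex.

```python
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    harmful: bool  # verdict from a hypothetical upstream harm classifier

# Hypothetical vetted help resources, pinned above all organic results.
HELP_RESOURCES = [Result("https://www.samaritans.org", harmful=False)]

def apply_safety_layer(results, related_searches, query_is_crisis):
    """Re-rank results and suppress suggestions for crisis-related queries.

    Nothing is censored outright: harmful results are demoted below every
    safe result instead of being removed, and the 'related searches'
    module is emptied so the engine stops volunteering further methods.
    """
    if not query_is_crisis:
        return results, related_searches
    safe = [r for r in results if not r.harmful]
    demoted = [r for r in results if r.harmful]
    return HELP_RESOURCES + safe + demoted, []

# Example: a crisis query pins the hotline first and pushes the
# harmful page to the bottom instead of page one.
ranked = [Result("https://badsite.example/methods", True),
          Result("https://forum.example/thread", False)]
results, suggestions = apply_safety_layer(ranked, ["painless ways of suicide"], True)
print([r.url for r in results])  # help resource first, harmful result last
print(suggestions)               # []
```

The design choice worth noting is demotion over deletion: the offending pages remain reachable for anyone who digs, which sidesteps the censorship objection, while the first page, the one an impulsive searcher actually sees, carries only help resources.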

Search engine results may seem like trivial, low-hanging fruit, but research has shown that suicide is often an impulsive act which, in many cases, can be prevented by barriers, even trivial ones. A New York Times article on the erection of suicide barriers on the Golden Gate Bridge (well worth reading) cites the work of academic researchers who support this.

Dr. Blaustein said, “The most common myth to explode is that people will go elsewhere.”

In a 1978 study, “Where Are They Now?” Richard H. Seiden, a former professor at the University of California, Berkeley, School of Public Health, looked at the question of whether someone prevented from committing suicide in one place would go somewhere else. He studied people who attempted suicide off the Golden Gate Bridge from 1937 to 1971 and found that more than 90 percent were still alive in 1978 or had died of natural causes.

Microsoft’s barriers need to be strengthened. As Satya Nadella noted earlier this year, technology should be built with empathy. It’s a good lesson, and one that needs to be applied here.
