The Call Center

In 2001, I wrote Net Attitude: What It Is, How to Get It, and Why Your Company Can’t Survive Without It. The book showed organizational leaders how a “net attitude” could make their organizations more successful by using the web to be more responsive to customers and constituents. The classic example of a poor net attitude is the call center, which I wrote about almost 20 years ago, and many call centers have still not improved. The pre-recorded greeting begins with, “Please pay attention because our menus have recently changed.” How is it that the menus of all call centers have recently changed? If you press “0”, you often get a response saying, “You have pressed an invalid key.” “Please enter your 16-digit account number” is often followed by a person in the call center asking, right after you entered the 16-digit number, “What is your account number?” These annoying, often repetitive responses are not due to a technical problem. They are due to a lack of net attitude.

Smartphone vendors and online retailers strive to outdo each other, and the top tier of them earn customer satisfaction scores in the range of 75 to 90 percent. The leading cable companies, on the other hand, hover in the range of 55 to 60.[i] One of the key elements in the difference is the call center.

In early 2019, I had a question for Comcast technical support. I tried my best but eventually gave up trying to reach them. It wasn’t the hold time; it was the attitude the company deploys. I called the main number. The first prompt said to press 1 if the call was about USC. I have no idea what USC means. University of Southern California? I was next prompted for my reason for calling. I said “technical support”. I then listened to 15 seconds of clicking sounds which I was supposed to think was an agent typing my request on a keyboard. Does anyone believe that? Next, I was asked to press 1 for slow Internet, 2 for connection problems, 3 for Wi-Fi password, or 4 for email trouble. There was no other choice. If I did not select one of those four options, I could not proceed. I pressed “0” hoping to reach a person, and the call center hung up on me. Cable companies face a number of issues related to pricing and contractual terms, but I believe annoying call centers contribute to the frustration and poor ratings.

According to Site Selection Group, a Dallas, TX, and Greenville, SC, provider of global location advice, there are 7,400 call centers in the U.S. employing more than three million people.[ii] John McCormick at the Wall Street Journal described how Cogito, a Boston-based augmented intelligence company, is using voice AI to make call centers more effective. McCormick explained,

As calls come into a center, they are streamed to Cogito’s system, which evaluates hundreds of data points including speech rate, tone and more. If agents are pausing before answering questions, it could indicate they’re distracted. If customers raise their voices, it could be a sign of frustration. When the Cogito system detects a possible issue with a call, it sends a notification in the form of an icon or short message to the staffer’s screen. It is a suggestion that the agent recognize and acknowledge the caller’s feelings.[iii]

The Cogito system in effect coaches the call center agents to help them become more confident, engaged, and empathetic. McCormick quoted the Cogito CEO as saying, “Learning to speak to different customers is a real skill. You’re not born with it. You have to learn it.”[iv]
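To make the idea concrete, here is a minimal Python sketch of the kind of real-time coaching signal McCormick describes. The thresholds and feature names are my own illustrative assumptions; Cogito’s actual models are proprietary.

```python
# Minimal sketch of a real-time coaching nudge for a call center agent.
# The features and thresholds below are illustrative assumptions, not
# Cogito's actual model.

from dataclasses import dataclass
from typing import Optional

@dataclass
class CallWindow:
    agent_pause_seconds: float        # silence before the agent's last answer
    customer_volume_db: float         # average loudness of the customer's speech
    customer_words_per_second: float  # how fast the customer is talking

def coaching_nudge(window: CallWindow) -> Optional[str]:
    """Return a short on-screen nudge for the agent, or None if the call looks fine."""
    if window.agent_pause_seconds > 3.0:
        return "Long pause detected -- re-engage the caller."
    if window.customer_volume_db > 70.0 or window.customer_words_per_second > 3.5:
        return "Caller may be frustrated -- acknowledge their feelings."
    return None

# Example: a window where the customer is speaking loudly and quickly.
print(coaching_nudge(CallWindow(1.2, 74.0, 3.8)))
```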

Some insurance companies have found the Cogito AI helpful, improving first-call problem resolution by as much as 10% along with customer satisfaction, but the technology clearly has a long way to go. I believe, over the next few years, we will find an AI will be able to learn how to satisfy customers better and faster than human agents. Applying machine learning to a very large number of customer calls, and matching the nature of each new problem with the solutions which resolved similar problems in the past, will enable much higher accuracy than a human can achieve. We can look forward to AI-created voices which sound like a human and an AI which can understand the questions we ask without asking us to press 1 for this and 2 for that.
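As a simple illustration of the matching idea, here is a toy Python sketch which retrieves the past solution most similar to a new problem description. The transcripts and resolutions are invented for the example; a real system would use far richer models and far more data.

```python
# Toy illustration of matching a new problem against solutions that worked
# in the past, using TF-IDF text similarity. All data below is invented.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import NearestNeighbors

past_problems = [
    "internet connection drops every evening",
    "cannot remember my wifi password",
    "email will not sync on my phone",
]
past_solutions = [
    "reboot the modem and check for neighborhood outages",
    "reset the wifi password from the account portal",
    "re-add the email account with the updated server settings",
]

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(past_problems)
index = NearestNeighbors(n_neighbors=1, metric="cosine").fit(matrix)

new_call = ["my wifi password is not working"]
_, match = index.kneighbors(vectorizer.transform(new_call))
print(past_solutions[match[0][0]])  # suggests the password-reset fix
```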


[i] “Benchmarks by Company,”  American Customer Satisfaction Index (2018), https://www.theacsi.org/index.php?option=com_content&view=article&id=149&catid=&Itemid=214&c=Comcast&i=Subscription+Television+Service

[ii] “Strategic Locations Solutions,”  Site Selection Group (2019), https://www.siteselectiongroup.com/site-selection-group-about

[iii] McCormick, “What AI Can Tell from Listening to You”.

[iv] Ibid.


Robot Attitude in Sun and Surf

Get a small glimpse of my upcoming book, Robot Attitude: How Robots and AI Will Make Our Lives Better, in the April issue of Sun and Surf Magazine. This is my third article to appear in Sun and Surf. Click here to enjoy the entire magazine, and flip to page 18 to see my article in a larger size. I hope you like it. If you would like an email when the book is published, just enter your email below and click “Let Me Know”. Your email will not be used for any other purpose.


Your Voice Diagnoses Your Heart

In the United States, about 610,000 people die of heart disease every year, accounting for one in every four deaths.[i] Heart disease is the leading cause of death for both men and women.[ii] Dr. Amir Lerman, director of the Cardiovascular Research Center at the Mayo Clinic, has focused his research on coronary physiology. He conducted a two-year study, ending in February 2017, to see if voice analysis was capable of detecting coronary artery disease. Dr. Lerman said, “Every person’s voice has different frequencies that can be analysed.”[iii]

Mayo has been collaborating with Beyond Verbal, a Tel Aviv, Israel, and Newton, MA, based company which is developing voice-enabled AI to create voice biomarkers. Biomarkers are normally thought of as biological molecules found in tissues, blood, or other body fluids which indicate a normal or abnormal bodily process or the presence of a specific condition or disease. Researchers at Beyond Verbal have discovered the human voice can also be a biomarker, specifically for coronary artery disease. The company believes voice biomarkers can be used for personalized healthcare screening and continuous remote monitoring of heart health.[iv] In collaboration with Mayo, Beyond Verbal used machine learning to identify specific voice biomarkers and then tested groups of people who were scheduled to get angiograms.

Participants in the study recorded their voices using a smartphone app provided by Beyond Verbal. An analysis of the recordings showed that patients with evidence of coronary artery disease on their angiograms also displayed voice biomarkers indicating presence of the disease.[v] The implication of the study is that heart disease can be detected without invasive and expensive tests. The combination of a patient’s voice plus AI will be able to do the job.
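To give a feel for how such a study works computationally, here is a generic Python sketch which tests whether voice features can predict an angiogram outcome. The features and labels are synthetic, and Beyond Verbal’s actual biomarkers are proprietary, so treat this as an outline of the approach rather than the company’s method.

```python
# Generic sketch of screening voice features against angiogram outcomes.
# The data here is synthetic; real voice biomarkers are far more sophisticated.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Pretend feature vectors (e.g. pitch variability, intensity, spectral measures)
# per participant, plus a label from the angiogram (1 = coronary artery disease).
n_participants = 200
features = rng.normal(size=(n_participants, 5))
labels = rng.integers(0, 2, size=n_participants)

# With real data, a model like this would learn which feature combinations
# behave as a "voice biomarker" for the disease.
model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, features, labels, cv=5)
print("cross-validated accuracy:", scores.mean())  # ~0.5 on random data
```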


[i] “Heart Disease Facts,”  Centers for Disease Control and Prevention (2019), https://www.cdc.gov/heartdisease/facts.htm

[ii] Ibid.

[iii] “Amir Lerman, M.D.,”  Mayo Clinic (2019), https://www.mayo.edu/research/faculty/lerman-amir-m-d/bio-00078051

[iv] “Beyond Verbal,”  Beyond Verbal (2019), https://beyondverbal.com/

[v] McCormick, “What AI Can Tell from Listening to You”.


Your Voice Tells It All

If a friend or relative calls you on the phone and something is wrong in his or her life, you can tell immediately. Could an AI tell also? Yes, and a whole lot more. Using the same machine learning technology which lets an AI tell a cat from a dog or recognize a person’s face, an AI can recognize your voice. Given access to a database containing a large number of voice samples, each labeled with whose voice it is, the AI can recognize you. Based on characteristics of your voice and the states of mind those characteristics are associated with, it can also tell if you are not feeling well, are upset about something, or are in a big hurry. The voice characteristics which can be detected include tone, tempo, volume, and the language and dialect used.
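As a small illustration, here is a Python sketch which extracts two of those characteristics, volume and tempo, from a raw audio signal. The signal is synthetic and the measures are simplified; real systems extract dozens of features before deciding whose voice it is and what state of mind it suggests.

```python
# Minimal sketch of extracting volume and a rough tempo measure from audio.
# The signal is synthetic so the example is self-contained.

import numpy as np

SAMPLE_RATE = 16_000  # samples per second

def volume_db(signal: np.ndarray) -> float:
    """Overall loudness as root-mean-square energy, in decibels."""
    rms = np.sqrt(np.mean(signal ** 2))
    return 20 * np.log10(rms + 1e-12)

def activity_rate(signal: np.ndarray, frame_ms: int = 30, threshold: float = 0.02) -> float:
    """Rough tempo proxy: fraction of 30 ms frames containing audible energy."""
    frame_len = SAMPLE_RATE * frame_ms // 1000
    frames = signal[: len(signal) // frame_len * frame_len].reshape(-1, frame_len)
    energies = np.sqrt(np.mean(frames ** 2, axis=1))
    return float(np.mean(energies > threshold))

# Synthetic one-second "voice": a 200 Hz tone with pauses.
t = np.linspace(0, 1, SAMPLE_RATE, endpoint=False)
signal = 0.1 * np.sin(2 * np.pi * 200 * t) * (np.sin(2 * np.pi * 3 * t) > 0)

print(volume_db(signal), activity_rate(signal))
```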

John McCormick, deputy editor of WSJ Pro Artificial Intelligence, wrote an excellent article about voice recognition called “What AI Can Tell From Listening to You”.[i] McCormick said that by analyzing voice data, an AI can determine a person’s emotions, mental and physical health, and height and weight, and can detect whether the person is depressed, in danger of a heart attack, or dozing at the wheel of a car. AI voice technology is already in use in a number of application areas. The following paragraphs contain some examples.

A major area of opportunity is in mental health. McCormick said,

In the U.S., mental illnesses affect one in five adults, or 46.6 million people in 2017, according to the National Institute of Mental Health, which estimates only half of those needing treatment receive it. Emerging voice technology may be able to make problems easier to spot.[ii]

CompanionMx Inc., a digital health technology company spawned out of the Massachusetts Institute of Technology, has introduced a mobile app called Companion. Patients are encouraged to talk to the app and describe how they are feeling. The app securely records the voice and extracts features which serve as digital biomarkers correlated with mental health symptoms.[iii] Using AI, the relevant data and trends are made available to a clinician, who can make better clinical decisions to improve the mental health of the patient. Researchers who have studied the Companion system have found the results very encouraging.[iv]
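Here is a small sketch of the trend-reporting idea: daily biomarker scores from an app, smoothed into a trend a clinician could review. The scoring scale and threshold are invented for illustration; CompanionMx’s actual analytics are proprietary.

```python
# Hypothetical daily biomarker scores smoothed into a trend for a clinician.
# Scale and threshold are invented for illustration.

import pandas as pd

# Higher score = more symptoms detected in the patient's voice that day.
scores = pd.Series(
    [0.2, 0.3, 0.25, 0.4, 0.5, 0.55, 0.6, 0.7, 0.65, 0.8],
    index=pd.date_range("2019-04-01", periods=10, freq="D"),
)

weekly_trend = scores.rolling(window=7, min_periods=3).mean()
flag = weekly_trend.iloc[-1] > 0.5  # flag an elevated trend for clinician review

print(weekly_trend.round(2))
print("flag for clinician review:", flag)
```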

Another interesting area of opportunity is keeping drivers awake. McCormick said,

More than 800 Americans died falling asleep behind the wheel in 2015, according to October 2017 statistics from the National Highway Traffic Safety Administration, and more than 30,000 people were injured in crashes involving drowsy drivers.

Many cars already have voice recognition for making phone calls or telling the car GPS system where you want to go. Some cars have external cameras to help avoid collisions. Cars could also have a camera on the dashboard watching you. McCormick said,

Now, many major car companies and artificial-intelligence companies are designing AI that uses voice analysis, along with facial recognition, to assess the alertness and emotional state of a driver.

At last year’s Consumer Electronics Show, Toyota Motor Corporation displayed a demonstration vehicle which can read facial expressions and voice tones. If the onboard AI detects signs you are getting tired, the car’s voice assistant could alert you and suggest you pull over to take a break. McCormick said the AI in the car could engage you in a conversation and, over time, learn what topics of discussion are most likely to keep you alert.[v] It could also blast you with loud music to help keep you awake.
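To illustrate the idea, here is a toy Python version of a drowsiness check which blends a voice cue with a camera cue. The weights and thresholds are purely illustrative and are not Toyota’s or any vendor’s actual algorithm.

```python
# Toy drowsiness score combining voice and camera cues.
# Weights and thresholds are purely illustrative assumptions.

def drowsiness_score(pause_seconds: float, voice_energy: float, eyes_closed_fraction: float) -> float:
    """Blend long pauses, flat speech, and eye closure into a 0-1 score."""
    pause_part = min(pause_seconds / 5.0, 1.0)       # long silences before replies
    energy_part = max(0.0, 1.0 - voice_energy)       # quiet, flat responses
    eye_part = min(eyes_closed_fraction / 0.3, 1.0)  # eyes closed too often
    return 0.3 * pause_part + 0.3 * energy_part + 0.4 * eye_part

score = drowsiness_score(pause_seconds=4.0, voice_energy=0.2, eyes_closed_fraction=0.25)
if score > 0.6:
    print("Suggest pulling over for a break (score = %.2f)" % score)
```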

Other areas where McCormick reported AI and voice analysis will eventually have a large impact include:

Humanizing the call center

Hiring the right candidates

Fighting fraud

Investigating crimes[vi]


Stay tuned for more on these areas next week.

[i] John McCormick, “What AI Can Tell from Listening to You,”  Wall Street Journal (2019), https://www.wsj.com/articles/what-ai-can-tell-from-listening-to-you-11554169408

[ii] Ibid.

[iii] “The Companion System Has Three Key Components,”  CompanionMx (2019), https://companionmx.com/product/

[iv] McCormick, “What AI Can Tell from Listening to You”.

[v] Ibid.

[vi] Ibid.


Precision Hip Replacements

When I first started having knee pain many years ago, an orthopedic surgeon told me he thought the problem may have arisen because one of my legs is a fraction of an inch shorter than the other. Such a condition may exist in many people who never notice the difference. However, if you are running marathons, as I was back then, the slight difference in leg length can become noticeable and lead to skeletal pain somewhere.

When someone gets a hip replaced, a similar problem may arise. After the replacement, it is not uncommon for the leg attached to the replaced hip to end up a little longer or shorter than it was before. If the leg length changes by less than 3/8 of an inch, the body is able to adapt and compensate. However, if the length change is greater, posture changes can develop, which can lead to back pain. A team of scientists at the Fraunhofer Institute for Machine Tools and Forming Technology in Chemnitz, Germany, may have a solution to the problem.

The artificial hip developed by the scientists is adjustable. After the hip is surgically inserted into the patient, the surgical team uses a computerized camera system to measure the exact length of the leg. If the length needs to be adjusted, the surgeon can turn a screw which connects the artificial hip’s femoral stem to the ball and socket. Once the measurement matches the proper length, the implant can be secured and the surgery completed. The adjustable system is in the testing phase, but the scientists are hopeful it will be ready for clinical adoption within two years. In the future, I envision a procedure in which very precise measurements are taken from MRI or CAT scans and a precision hip replacement is then created with a 3-D printer.
