AWS for Industries
Limiting Subscriber Churn by leveraging real-time subscribers’ feedback – part 2 of 2
In the first blog post of this series, we introduced a serverless approach for CSPs to capture and process subscriber service performance in real time. We covered how, by leveraging AWS real-time analytics services, CSPs can identify subscribers experiencing a significant rate of service impairment. Incident events are stored in a data lake, while each subscriber's performance over time is tracked in an Amazon DynamoDB table.
This second blog post explores how CSPs can leverage their event store and subscribers' real-time performance data to engage subscribers directly and capture their sentiment about recent performance issues. With a combination of natively integrated AWS services, spanning real-time analytics, object storage, customer engagement, serverless compute, and a serverless NoSQL database, CSPs can:
- Identify the worst-performing subscribers with advanced analytics in real time
- Engage subscribers on the basis of their historical network and service performance
- Capture their feedback to validate and weight their incidents
- Monitor the sentiment of the entire subscriber base in real time and associate it with other dimensions (cell, device, network vendor, etc.)
- Prioritize incident intervention based on subscriber sentiment
This blog post covers points 2–5, building on top of the foundational layer described in the first blog post; refer to the diagram below for the architecture covered here.
Here are the functional steps of the solution covered in this second blog post:
- Every time a new incident is recorded, the subscriber’s incident profile is evaluated
- When the incident profile indicates low-quality service, Amazon Connect immediately contacts the subscriber
- Through Amazon Connect, the subscriber provides feedback on how the most recent event affected their customer experience.
- The feedback is recorded, and the DynamoDB table is updated with the subscriber’s sentiment
- CSP operations teams leverage Amazon QuickSight to visualize in real time the cells experiencing the largest degradation of sentiment across the subscriber base, so as to prioritize remediation.
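Throughout these steps, the per-subscriber state lives in the DynamoDB Subscriber_table built in the first blog post. As a minimal sketch, assuming illustrative attribute names (phone_number as the key, per-incident rate counters, and a running sentiment score; the actual schema is defined by the code referenced in part 1), an item in that table might look like this:

from decimal import Decimal

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("Subscriber_table")

# Illustrative item only: attribute names are assumptions, not the reference schema.
table.put_item(
    Item={
        "phone_number": "+447700900123",        # partition key (example value)
        "cell_id": "cell-0042",                 # serving cell of the latest incident
        "call_drop_rate": Decimal("0.21"),      # rolling incident rates over the tracking window
        "call_quality_rate": Decimal("0.05"),
        "low_bitrate_rate": Decimal("0.02"),
        "video_stalling_rate": Decimal("0.04"),
        "last_incident_type": "call_drop",
        "last_incident_ts": "2023-05-04T10:15:00Z",
        "sentiment": 0,                         # updated after each engagement
    }
)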
Technical description
The solution demonstrated in this post is built in the Europe (London) Region. You can choose any other AWS Region where the following services are available:
- Amazon S3
- AWS Lambda
- Amazon Simple Queue Service (Amazon SQS)
- Amazon DynamoDB
- Amazon Connect
- AWS Glue
- Amazon Athena
- Amazon QuickSight
For more information about AWS Regions and where AWS services are available, visit the Region Table.
The following prerequisites must be in place to build this solution:
- An AWS account
- The AdministratorAccess policy granted to your AWS account (for production, you should restrict access as needed)
- Event records about CSP service performance fed by a third-party OSS probe-based monitoring solution
Engagement Handler
The following diagram illustrates the engagement handler block of the architecture. It is triggered by an Amazon DynamoDB stream and, based on the nature and severity of the incident, it decides whether to engage the impacted subscriber. If the decision is affirmative, the block triggers the subscriber engagement block and the engagement update block.
For illustrative purposes, service instance names are shortened in the diagram.
SQS 2 is called engagement_update_queue
Lambda 3 is called Read-after-incident
FIFO SQS 3 is called engagement_trigger_queue.fifo
SQS 2: engagement_update_queue
Create Amazon SQS queue
Complete the creation steps contained in Creating an Amazon SQS queue (console), with the following specifics:
- At point 3, choose Standard
- At point 4, type the name engagement_update_queue
- At points 5a to 5e, keep the default choices
- At point 6, choose Basic method
- At Define who can send messages to the queue selection, choose Only the queue owner
- At Define who can receive messages from the queue selection, choose Only the queue owner
- At point 7, enable encryption
- At point 8, enable dead-letter queue
Why standard queue?
This standard queue is used to decouple the two Lambda functions. Message ordering is not strictly required (updates to Amazon S3 can happen in any order), hence the choice of a standard queue.
FIFO SQS 3: engagement_trigger_queue.fifo
Create Amazon SQS queue
Complete the creation steps contained in Creating an Amazon SQS queue (console), with the following specifics:
- At point 3, choose FIFO
- At point 4, type the name engagement_trigger_queue.fifo (FIFO queue names must end with the .fifo suffix)
- At points 5a to 5e, keep the default choices
- At point 5f, Enable content-based deduplication
- At point 6, choose Basic method
- At Define who can send messages to the queue selection, choose Only the queue owner
- At Define who can receive messages from the queue selection, choose Only the queue owner
- At point 7, enable encryption
- At point 8, enable dead-letter queue
Why a FIFO queue?
The FIFO queue is chosen primarily:
- To buffer engagement messages that trigger Amazon Connect, which depends on telephony carriers for outbound calls
- To preserve message order: a single subscriber might experience two types of incidents within a five-minute window, and it is important that the engagements are triggered in the right order (see the sketch below)
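As a minimal sketch of that ordering requirement, assuming illustrative message fields and a queue URL supplied by the caller, the engagement message can be sent with the MessageGroupId set to the subscriber identifier:

import json

import boto3

sqs = boto3.client("sqs")

def send_engagement_trigger(queue_url: str, msisdn: str, incident_type: str) -> None:
    sqs.send_message(
        QueueUrl=queue_url,
        MessageBody=json.dumps({"msisdn": msisdn, "incident_type": incident_type}),
        # One message group per subscriber: per-subscriber ordering is preserved,
        # while messages for different subscribers can be processed in parallel.
        MessageGroupId=msisdn,
        # Content-based deduplication is enabled on the queue, so no explicit
        # MessageDeduplicationId is required.
    )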
Lambda 3: Read-after-incident
Create Lambda function
Complete the steps contained in Create Lambda Function with the following amendments:
- Type Read-after-incident in the Function name field
- Select Python 3.9 as the runtime version
- Keep the default execution role selection
Upload the code
Complete the following steps:
- Select the newly created Lambda function from the main dashboard where all Lambda functions are displayed
- Beneath the Function overview panel, select the Code tab
- In the Code Source panel, copy and paste the code found at GitHub and follow the instructions contained in the README file.
Set up trigger
Complete the following steps:
- In the Function overview panel, choose Add trigger
- Select DynamoDB in the Trigger configuration panel
- Select the DynamoDB table previously created (Subscriber_table) from the dropdown menu of the DynamoDB field
- Choose Latest from the Starting Position dropdown menu
- Choose Enable Trigger
- Choose Add
Set up destination
No destination is configured for this Lambda function.
Permissions
Following the principle of least privilege, complete the following steps:
- Beneath the Function overview panel, select the Configuration tab, then go to the Permissions section (as illustrated below)
- Select the role name within the Execution Role panel to open up the IAM console page
- Five permissions policies must be attached; their JSON representations are reported below, where AWSaccountnumber is to be replaced with your AWS account number:
- Get Item from Amazon DynamoDB Subscriber_table table
{ "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": "dynamodb:GetItem", "Resource": "arn:aws:dynamodb:eu-west-2:AWSaccountnumber:table/Subscriber_table" } ] }
- Send Message to engagement_trigger_queue.fifo and engagement_update_queue
{ "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": "sqs:SendMessage", "Resource": [ "arn:aws:sqs:eu-west-2:AWSaccountnumber:engagement_trigger_queue.fifo", "arn:aws:sqs:eu-west-2:AWSaccountnumber:engagement_update_queue" ] } ] }
- Lambda Execution Role to log events
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": "logs:CreateLogGroup", "Resource": "arn:aws:logs:eu-west-2: AWSaccountnumber:*" }, { "Effect": "Allow", "Action": [ "logs:CreateLogStream", "logs:PutLogEvents" ], "Resource": [ "arn:aws:logs:eu-west-2:AWSaccountnumber:log-group:/aws/lambda/Read-after-incident:*" ] } ] }
- AWSLambdaInvocation-DynamoDB permission policy
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "lambda:InvokeFunction" ], "Resource":["arn:aws:lambda:eu-west-2: AWSaccountnumber:function:Read-after-incident" ] }, { "Effect": "Allow", "Action": [ "dynamodb:DescribeStream", "dynamodb:GetRecords", "dynamodb:GetShardIterator", "dynamodb:ListStreams" ], "Resource":["arn:aws:dynamodb:eu-west-2: AWSaccountnumber:table/Subscriber_table/stream/name_of_the_stream" ] } ] }
- AWSLambdaDynamoDBExecutionRole permission policy
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "lambda:InvokeFunction" ], "Resource":["arn:aws:lambda:eu-west-2: AWSaccountnumber:function:Read-after-incident" }, { "Effect": "Allow", "Action": [ "dynamodb:DescribeStream", "dynamodb:GetRecords", "dynamodb:GetShardIterator", "dynamodb:ListStreams" ], "Resource":["arn:aws:dynamodb:eu-west-2: AWSaccountnumber:table/Subscriber_table/stream/name_of_the_stream" ] } ] }
Environment Variables
Still in the Configuration tab, go to the Environment variables section. Configure the following environment variables, replacing AWSaccountnumber with your AWS account number (a sketch of how the function might use them follows the list):
- Key = trigger_eng_queue_url, Value = https://sqs.eu-west-2.amazonaws.com/AWSaccountnumber/engagement_trigger_queue.fifo
- Key = update_eng_queue_url, Value = https://sqs.eu-west-2.amazonaws.com/AWSaccountnumber/engagement_update_queue
- Key = call_drop_threshold, Value = 0.16
- Key = call_quality_threshold, Value = 0.18
- Key = low_bitrate_threshold, Value= 0.09
- Key = video_stalling_threshold, Value = 0.11
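As a minimal sketch of how Read-after-incident might combine these variables with the DynamoDB stream trigger, assuming illustrative attribute names in the stream image and an illustrative message payload (the reference implementation is in the GitHub repository):

import json
import os

import boto3

sqs = boto3.client("sqs")

THRESHOLDS = {
    "call_drop": float(os.environ["call_drop_threshold"]),
    "call_quality": float(os.environ["call_quality_threshold"]),
    "low_bitrate": float(os.environ["low_bitrate_threshold"]),
    "video_stalling": float(os.environ["video_stalling_threshold"]),
}

def lambda_handler(event, context):
    for record in event["Records"]:
        if record["eventName"] not in ("INSERT", "MODIFY"):
            continue
        image = record["dynamodb"]["NewImage"]
        msisdn = image["phone_number"]["S"]                  # assumed key name
        incident_type = image["last_incident_type"]["S"]     # assumed attribute name
        rate = float(image[f"{incident_type}_rate"]["N"])    # assumed attribute layout

        if rate < THRESHOLDS.get(incident_type, 1.0):
            continue  # still within tolerance: do not engage the subscriber

        payload = json.dumps({"msisdn": msisdn, "incident_type": incident_type, "rate": rate})
        # Trigger the outbound engagement (ordered per subscriber) ...
        sqs.send_message(
            QueueUrl=os.environ["trigger_eng_queue_url"],
            MessageBody=payload,
            MessageGroupId=msisdn,
        )
        # ... and record the engagement decision for the data lake.
        sqs.send_message(
            QueueUrl=os.environ["update_eng_queue_url"],
            MessageBody=payload,
        )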
Engagement Update
The following diagram illustrates the engagement update block of the architecture. It is triggered by the Amazon SQS queue of the engagement handler block and writes engagement events into the data lake block.
For illustrative purposes, service instance names are shortened in the diagram.
Lambda 4 is called write_engagement_to_S3
Lambda 4: write_engagement_to_S3
Create Lambda function
Complete the steps contained in Create Lambda Function with the following amendments:
- Type write_engagement_to_S3 in the Function name field
- Select Python 3.9 as the runtime version
- Keep the default execution role selection
Upload the code
Complete the following steps:
- Select the newly created Lambda function from the main dashboard where all Lambda functions are displayed
- Beneath the Function overview panel, select the Code tab
- In the Code Source panel, copy and paste the code found at GitHub and follow the instructions contained in the README file.
Set up trigger
Complete the following steps:
- In the Function overview panel, choose Add trigger
- Select SQS in the Trigger configuration panel
- Select the engagement_update_queue from the dropdown menu of the SQS Queue field
- Keep the default Batch size
- Choose Enable Trigger
- Choose Add
Set up destination
No destination is configured for this Lambda function.
Permissions
Following the principle of least privilege, complete the following steps:
- Beneath the Function overview panel, select the Configuration tab, then go to the Permissions section (as illustrated below)
- Select the role name within the Execution Role panel to open up the IAM console page
- Three permissions policies must be attached; their JSON representations are reported below, where AWSaccountnumber is to be replaced with your AWS account number:
- Receive from and delete message on SQS queue engagement_update_queue
{ "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": [ "sqs:DeleteMessage", "sqs:ReceiveMessage", "sqs:GetQueueAttributes" ], "Resource": "arn:aws:sqs:eu-west-2:AWSaccountnumber:engagement_queue" } ] }
- Put Object, Get Object, and List Bucket permission for the Amazon S3 bucket previously created, in this post called “event_bucket”
{ "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": [ "s3:PutObject", "s3:GetObject", "s3:ListBucket" ], "Resource": [ "arn:aws:s3:::event_bucket", "arn:aws:s3:::event_bucket"/*" ] } ] }
- Lambda Execution Role to log events
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": "logs:CreateLogGroup", "Resource": "arn:aws:logs:eu-west-2: AWSaccountnumber:*" }, { "Effect": "Allow", "Action": [ "logs:CreateLogStream", "logs:PutLogEvents" ], "Resource": [ "arn:aws:logs:eu-west-2: AWSaccountnumber:log-group:/aws/lambda/write_engagement_to_S3:*" ] } ] }
Environment Variables
Still in the Configuration tab, go to the Environment variables section and configure the following environment variable (a sketch of the function's logic follows the list):
- Key = bucket_name, Value = event_bucket
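As a minimal sketch of the function's logic, assuming an illustrative date-partitioned key layout (chosen so that the AWS Glue crawler and Amazon Athena can work with the data efficiently; the reference implementation is on GitHub):

import json
import os
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
BUCKET = os.environ["bucket_name"]

def lambda_handler(event, context):
    for record in event["Records"]:      # one entry per SQS message in the batch
        body = json.loads(record["body"])
        now = datetime.now(timezone.utc)
        # Date partitions let the Glue crawler and Athena prune data efficiently.
        key = (
            f"engagement/year={now:%Y}/month={now:%m}/day={now:%d}/"
            f"{record['messageId']}.json"
        )
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(body).encode("utf-8"))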
Subscriber Engagement
The following diagram illustrates the subscriber engagement block of the architecture. It is triggered by the engagement trigger queue in Amazon SQS, places an outbound voice call to the subscriber, and invokes the feedback and sentiment update block.
For illustrative purposes, service instance names are shortened in the diagram.
Lambda 5 is called Trigger_Connect
Lambda 5: Trigger_Connect
Create Lambda function
Complete the steps contained in Create Lambda Function with the following amendments:
- Type Trigger_Connect in the Function name field
- Select Python 3.9 as the runtime version
- Keep the default execution role selection
Upload the code
Complete the following steps:
- Select the newly created Lambda function from the main dashboard where all Lambda functions are displayed
- Beneath the Function overview panel, select the Code tab
- In the Code Source panel, copy and paste the code found at GitHub and follow the instructions contained in the README file.
ContactFlowId, InstanceId, and SourcePhoneNumber must be updated with the values from the Amazon Connect instance defined in the Amazon Connect section below.
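As a minimal sketch of the outbound call trigger, assuming the Amazon Connect identifiers are supplied through illustrative environment variables (the reference code on GitHub expects you to update ContactFlowId, InstanceId, and SourcePhoneNumber directly) and an illustrative contact attribute is passed to the flow:

import json
import os

import boto3

connect = boto3.client("connect")

def lambda_handler(event, context):
    for record in event["Records"]:
        body = json.loads(record["body"])
        connect.start_outbound_voice_contact(
            DestinationPhoneNumber=body["msisdn"],
            InstanceId=os.environ["instance_id"],            # hypothetical env var
            ContactFlowId=os.environ["contact_flow_id"],     # hypothetical env var
            SourcePhoneNumber=os.environ["source_phone_number"],
            # Contact attributes are available to the Subscriber_engagement contact
            # flow, so the prompt can reference the incident the subscriber experienced.
            Attributes={"incident_type": body["incident_type"]},
        )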
Set up trigger
Complete the following steps:
- In the Function overview panel, choose Add trigger
- Select SQS in the Trigger configuration panel
- Select the engagement_trigger_queue.fifo from the dropdown menu of the SQS Queue field
- Keep the default Batch size
- Choose Enable Trigger
- Choose Add
Set up destination
No destination is configured for this Lambda function.
Permissions
Following the principle of least privilege, complete the following steps:
- Beneath the Function overview panel, select the Configuration tab, then go to the Permissions section
- Select the role name within the Execution Role panel to open up the IAM console page
- Three permissions policies must be attached; their JSON representations are reported below, where AWSaccountnumber is to be replaced with your AWS account number:
- Start an Outbound Voice call to the subscriber
{ "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": "connect:StartOutboundVoiceContact", "Resource": "arn:aws:connect:eu-west-2:AWSaccountnumber:instance/instanceID/contact/ContactID" } ] }
- Receive from and Delete message on SQS queue engagement_trigger_queue.fifo
{ "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": [ "sqs:DeleteMessage", "sqs:ReceiveMessage", "sqs:GetQueueAttributes" ], "Resource": "arn:aws:sqs:eu-west-2:AWSaccountnumber:engagement_trigger_queue.fifo" } ] }
- Lambda Execution Role to log events
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": "logs:CreateLogGroup", "Resource": "arn:aws:logs:eu-west-2: AWSaccountnumber:*" }, { "Effect": "Allow", "Action": [ "logs:CreateLogStream", "logs:PutLogEvents" ], "Resource": [ "arn:aws:logs:eu-west-2: AWSaccountnumber:log-group:/aws/lambda/Trigger_Connect:*" ] } ] }
Amazon Connect
Create an Amazon Connect instance
Complete the steps in Create an Amazon Connect instance with the following amendments:
- At point 3 of Step 1, choose Store users within Amazon Connect
- At point 4 of Step 1, enter an access URL
- At point 1 of Step 3, do NOT choose I want to handle incoming calls with Amazon Connect
- At point 2 of Step 3, choose I want to make outbound calls with Amazon Connect
The following page should display; choose Let’s go.
Set up phone number
Complete the steps explained in Claim a phone number to claim a phone number in your country.
For the list of countries you can call with Amazon Connect (outbound voice calls), see Amazon Connect service quotas and scroll down to the section entitled "Countries you can call."
Set up routing
In the Navigation panel, choose Routing, Queues. Keep the BasicQueue created by default.
Create contact flows
- In the Navigation panel, choose Routing, Contact Flows.
- Choose Create contact flow
- Beside the grayed-out Save button, select the downward arrow and choose Import flow (beta)
- Select the Subscriber_engagement.json workflow found at GitHub and save it with the name Subscriber_engagement
- In the Navigation panel, Choose Routing, Phone Number
- Select the claimed phone number
- In the Contact flow / IVR field, choose Subscriber_engagement contact flow
Feedback and Sentiment Update
The following diagram illustrates the feedback and sentiment update block of the architecture. It is triggered by an Amazon Connect instance, updates the DynamoDB Subscriber_table with the feedback collected from the subscriber, and updates the sentiment score of the subscriber.
For illustrative purposes, service instance names are shortened in the diagram.
Lambda 6 is called Collect_feedback_from_Connect
Lambda 6: Collect_feedback_from_Connect
Create Lambda function
Complete the steps contained in Create Lambda Function with the following amendments:
- Type Collect_feedback_from_Connect in the Function name field
- Select Python 3.9 as the runtime version
- Keep the default execution role selection
Upload the code
Complete the following steps:
- Select the newly created Lambda function from the main dashboard where all Lambda functions are displayed
- Beneath the Function overview panel, select the Code tab
- In the Code Source panel, copy and paste the code found at GitHub and follow the instructions contained in the README file.
Set up trigger
No trigger is configured for this Lambda function. It is invoked within Amazon Connect.
Set up destination
No destination is configured for this Lambda function.
Permissions
Following the principle of least privilege, attach the following four permissions policies to the function's execution role (through the Permissions section of the Configuration tab, as for the previous Lambda functions), replacing AWSaccountnumber with your AWS account number:
- Put Object, Get Object, and List Bucket permission for the Amazon S3 bucket previously created, in this post called “event_bucket”
{ "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": [ "s3:PutObject", "s3:GetObject", "s3:ListBucket" ], "Resource": [ "arn:aws:s3:::event_bucket", "arn:aws:s3:::event_bucket/*” ] } ] }
- Get Item in the DynamoDB Subscriber_table
{ "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": "dynamodb:GetItem", "Resource": "arn:aws:dynamodb:eu-west-2:AWSaccountnumber:table/Subscriber_table" } ] }
- Generate KMS Data Key for recording calls
{ "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": "kms:GenerateDataKey", "Resource": "arn:aws:kms:eu-west-2: AWSaccountnumber:key/*" } ] }
- Lambda Execution Role to log events
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "logs:CreateLogGroup", "kms:GenerateDataKey" ], "Resource": "arn:aws:logs:eu-west-2: AWSaccountnumber:*" }, { "Effect": "Allow", "Action": [ "logs:CreateLogStream", "logs:PutLogEvents" ], "Resource": [ "arn:aws:logs:eu-west-2: AWSaccountnumber:log-group:/aws/lambda/write_engagement_to_S3:*" ] } ] }
Environment Variables
Still in the Configuration tab, go to the Environment variables section and configure the following environment variable (a sketch of how the collected feedback might be processed follows the list):
- Key = bucket_name, Value = event_bucket
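As a minimal sketch of how the collected feedback might be processed, assuming the contact flow passes the subscriber's answer as a parameter named feedback_score and assuming an illustrative sentiment mapping and S3 key layout (the event shape follows the Amazon Connect Lambda integration, with contact data and parameters under Details; the reference implementation is on GitHub):

import json
import os
from datetime import datetime, timezone

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("Subscriber_table")
s3 = boto3.client("s3")

def lambda_handler(event, context):
    contact = event["Details"]["ContactData"]
    msisdn = contact["CustomerEndpoint"]["Address"]
    # The contact flow is assumed to pass the DTMF answer (1-5) as a parameter.
    score = int(event["Details"]["Parameters"]["feedback_score"])
    delta = score - 3    # assumed mapping of a 1-5 answer to a -2..+2 sentiment change

    # Update the subscriber's running sentiment in DynamoDB.
    table.update_item(
        Key={"phone_number": msisdn},    # assumed partition key name
        UpdateExpression="SET last_feedback_ts = :ts ADD sentiment :d",
        ExpressionAttributeValues={
            ":ts": datetime.now(timezone.utc).isoformat(),
            ":d": delta,
        },
    )

    # Land the raw feedback in the data lake for offline analytics.
    key = f"sentiment/{msisdn}-{contact['ContactId']}.json"
    s3.put_object(
        Bucket=os.environ["bucket_name"],
        Key=key,
        Body=json.dumps(
            {"msisdn": msisdn, "score": score, "change_sentiment": delta}
        ).encode("utf-8"),
    )

    # Amazon Connect expects a flat map of key/value pairs in the response.
    return {"status": "recorded"}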
Offline Analytics
The application explored in this post uses the sentiment captured across the entire subscriber base to identify the cells with the largest change of sentiment, in order to prioritize operations. Other use cases can also be built on the collected data.
The following diagram illustrates the offline analytics block of the architecture. It reads incident records, engagement records, feedback records, and sentiments.
Amazon Athena
Using AWS Glue to connect to data sources in Amazon S3
Complete the steps at Using AWS Glue to Connect to Data Sources in Amazon S3 to connect Amazon Athena to the data through an AWS Glue crawler (option A in the "Setting up a Crawler" section).
Refer to the AWS Glue section for next steps.
AWS Glue
Add a crawler
Complete the following steps to add a crawler:
- On the AWS Glue service console, on the left-hand side menu, choose Crawlers.
- On the Crawlers page, choose Add crawler. This starts a series of pages that prompt you for the crawler details.
- In the Crawler name field, enter Sentiment_Crawler, and choose Next.
- Crawlers invoke classifiers to infer the schema of your data and use CSV by default.
- For the crawler source type, choose Data stores.
- For the Repeat crawls of S3 data stores, choose Crawl all folders, then choose Next.
- On the Add a data store page, choose the Amazon S3 data store.
- For the option Crawl data in, choose Specified path. Then, for the Include path, enter the path where the crawler can find the sentiment data, which is s3://event_bucket/sentiment. Choose Next.
- The crawler needs permissions to access the data store and create objects in the AWS Glue Data Catalog. To configure these permissions, choose Create an IAM role. The IAM role name starts with AWSGlueServiceRole-, and in the field, you enter the last part of the role name. Enter sentiment, and then choose Next.
- Create a schedule for the crawler. For Frequency, choose Run on demand, and then choose Next.
- Choose Add database to create a database. In the pop-up window, enter sentimentdb for the database name, and then choose Create.
- Use the default values for the rest of the options, and choose Next.
- Verify the choices you made in the Add crawler wizard. If you see any mistakes, you can choose Back to return to previous pages and make changes.
- After you have reviewed the information, choose Finish to create the crawler.
Add Security Configurations
Enable S3 encryption, CloudWatch logs encryption, and job bookmark encryption by following the instructions found at Working with Security Configurations on the AWS Glue Console.
Run the crawler
- To run the crawler, select the Sentiment_Crawler and choose Run it now.
- When the crawler completes, a banner appears that describes the changes made by the crawler.
- In the Navigation panel on the left-hand side, choose Tables.
- Choose the sentiment table
- The data structure resembles the one shown in the screenshot below
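With the table in the AWS Glue Data Catalog, the same data can also be queried programmatically with Amazon Athena outside of QuickSight. The following sketch lists the cells with the largest overall drop in sentiment; the column names (cell, phone_number, change_sentiment) mirror the dashboard fields used in the next section and are assumptions, as is the query results location:

import boto3

athena = boto3.client("athena")

QUERY = """
SELECT cell,
       COUNT(DISTINCT phone_number) AS impacted_subscribers,
       SUM(change_sentiment)        AS total_sentiment_change
FROM sentimentdb.sentiment
GROUP BY cell
ORDER BY total_sentiment_change ASC
LIMIT 20
"""

response = athena.start_query_execution(
    QueryString=QUERY,
    QueryExecutionContext={"Database": "sentimentdb"},
    # Assumed results prefix inside the existing bucket.
    ResultConfiguration={"OutputLocation": "s3://event_bucket/athena-query-results/"},
)
print(response["QueryExecutionId"])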
Amazon QuickSight
Signing in to Amazon QuickSight
Before creating an Amazon QuickSight dashboard, please sign in to Amazon QuickSight by completing the steps in Signing in to Amazon QuickSight.
Create a new analysis
- On the Amazon QuickSight start page, choose New analysis.
- Choose Athena.
- Create a new data source by entering sentimentdb in Data source name.
- For Catalog: contain sets of databases, select AWSDataCatalog.
- For Database: contain sets of tables, select sentimentdb.
- Select the sentiment table.
Create a new dashboard
- From the view below, choose Create Analysis
- Name the analysis Cells with incidents with the highest decrease of sentiment
- Choose the Heatmap visual type
- Group by cell
- Size by distinct count of phone numbers
- Color by change_sentiment
The heatmap represents radio cells: the size of each cell is proportional to the number of unique subscribers who experienced incidents on it, while the color tracks the extent of the overall sentiment decrease. Large, solid red cells are the top priority for intervention.
Alternative engagement services
In this blog post, I have used Amazon Connect to engage subscribers based on their patterns of service and network performance; alternative options are explored here.
Amazon Pinpoint
Amazon Pinpoint email, voice, push notification, and SMS channels offer deliverability and scale to reach hundreds of millions of customers around the globe.
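As a minimal sketch, assuming an existing Amazon Pinpoint project with the SMS channel enabled (the project ID and message text are illustrative), a text-message engagement could look like this:

import boto3

pinpoint = boto3.client("pinpoint")

def send_sms_engagement(project_id: str, msisdn: str) -> None:
    pinpoint.send_messages(
        ApplicationId=project_id,    # the Pinpoint project (application) ID
        MessageRequest={
            "Addresses": {msisdn: {"ChannelType": "SMS"}},
            "MessageConfiguration": {
                "SMSMessage": {
                    "Body": "We noticed recent issues with your service. "
                            "Reply 1-5 to tell us how it affected you.",
                    "MessageType": "TRANSACTIONAL",
                }
            },
        },
    )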
Amazon SNS – Mobile Push
With Amazon SNS, you have the ability to send push notification messages directly to apps on mobile devices. Push notification messages sent to a mobile endpoint can appear in the mobile app as message alerts, badge updates, or even sound alerts.
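As a minimal sketch, assuming the subscriber's device token has already been registered as an SNS platform endpoint (the endpoint ARN and message text are illustrative):

import json

import boto3

sns = boto3.client("sns")

def send_push_engagement(endpoint_arn: str) -> None:
    sns.publish(
        TargetArn=endpoint_arn,
        MessageStructure="json",
        # "default" is mandatory; platform-specific payloads (e.g. GCM, APNS)
        # can be added alongside it in the same JSON document.
        Message=json.dumps(
            {"default": "How did the recent service issue affect your experience?"}
        ),
    )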
Conclusion
AWS services provide CSPs with the ability to build an in-context customer engagement solution that meets the needs of their evolving operations.
This solution helps CSPs validate network and service incidents directly with subscribers in real time. By capturing subscriber sentiment following recurring incident patterns, CSPs can prioritize operations with the objective of reducing churn and minimizing the strain on first-line support.
As always, AWS welcomes feedback. Please submit comments or questions in the comments section.
Contribution
- Technical description – Christian and Angelo (AWS)
- Intro, Solution, Conclusion – Christian Finelli (AWS), Angelo Sampietro (AWS), Ludovica Chiacchierini, Tom Edwards