“Data beats opinions.” That’s a saying that never gets old, and the more data you have access to, the better. With AppExchange App Analytics, ISV partners can get usage data on how users interact with their AppExchange solutions. This kind of data and the related analytics offer partners real benefits, such as identifying attrition risks and informing feature development decisions.

In this blog post, I will walk through the necessary steps for deploying the App Analytics app and an AWS stack that fetches App Analytics daily log files and stores them in an AWS S3 bucket. We’re going to get technical, so grab your caffeinated beverage of choice and let’s dive in.


Also, I would like to thank my colleague Jeremy Hay Draude, the ISV Platform Expert responsible for Partner Intelligence. To request a consultation, simply log a case through the Partner Community and choose “Customer 360 Platform Expert Consultation” as the SubTopic.

Pre-requisites:

  • A Salesforce Partner Business Org username with API enabled
  • An existing AWS account, or sign up for a new one at aws.amazon.com

The app is available for free via the Salesforce Labs program, and you can find it on the AppExchange. It has a “Log Pull Config” object that stores the log pull configuration (app name, package ID) and a “Log Pull Activity” object for tracking log pull activity.

Disclaimer: The AWS stack in this blog post is provided as-is and as general guidance. You are responsible for everything, including all expenses incurred.

Let’s start with how you’ll need to set up Salesforce. This setup should take around 15 minutes.

Pre-requisite: Make sure your partner business org (PBO) is enabled for daily logs. Verify, using this free app, that you can retrieve a file for the data type “Package Usage Log”. If it is not enabled, please open a support case as described in this doc.

Step 1: Install the app

Install the free app from its AppExchange listing, as described above.

Step 2: Assign the permission set “PILabappConfigPSL” to the users who are going to access the PI Labapp, as well as to the API user that will call the AppAnalyticsRequest API from AWS.

Step 3: Configure the Log Pull records

Go to the “PI Labapp” application. In the “Log Pull Config” tab, create a new record as shown below.

  • AppName: The name of your app. No special characters can be used.
  • AppPackageId: The main package ID of your app. This is the first 15 characters of the 18-character Package ID associated with your AppExchange listing, i.e., strip the last three characters (see the short sketch after this list).
  • Packages: If you have only one package associated with the AppExchange listing, copy the same value as AppPackageId into this field. If you have multiple extension packages associated with the main package, enter the AppPackageId followed by the comma-delimited extension package IDs, e.g. packageid1, packageid2, etc.
  • ExpectedVolume: Select ‘HIGH’ if the package log volume is high (this pulls each day’s log files by making 24 hourly requests). Select ‘LOW’ if the log volume is going to be low (this pulls each day’s log files in a single request).
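The 18-to-15 character truncation in AppPackageId is easy to get wrong by hand. Here is a minimal Python sketch for deriving the AppPackageId and Packages values; the package IDs in it are made-up examples:

    # Derive the 15-character AppPackageId and the comma-delimited Packages
    # value from 18-character package IDs. The IDs below are made-up examples.

    def to_15(package_id_18: str) -> str:
        """Strip the 3-character suffix from an 18-character Salesforce ID."""
        if len(package_id_18) != 18:
            raise ValueError(f"Expected an 18-character ID, got: {package_id_18}")
        return package_id_18[:15]

    main_package = to_15("033000000000001AAA")   # hypothetical main package ID
    extensions = [to_15("033000000000002AAA")]   # hypothetical extension package

    app_package_id = main_package
    packages = ", ".join([app_package_id] + extensions)  # main ID plus extensions

    print(app_package_id)  # 033000000000001
    print(packages)        # 033000000000001, 033000000000002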

Now, let’s get AWS in a good place. This setup should take around 20 minutes.

Step 1: Install the stack

  • The stack is currently available for two regions: us-east-1 and ap-northeast-1 (Tokyo). Note: other regions will be supported soon; we are working through the additional configuration.
  • In the AWS console, change the region to us-east-1 or ap-northeast-1 (the region selector is in the menu near the top right of the console).

  • Go to AWS Services → CloudFormation → Stacks
  • Click “Create stack” with the option “With new resources (standard)”
  • For the us-east-1 region use this S3 URL; for the ap-northeast-1 region use this S3 URL
  • Name the stack, accept all the remaining defaults, and create the stack
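If you prefer scripting to the console, the stack can also be created with boto3. This is only a sketch: the stack name is arbitrary, the template URL is a placeholder for the region-specific S3 URL linked above, and the CAPABILITY_IAM capability is an assumption based on the stack creating IAM roles.

    import boto3

    # Placeholder: substitute the region-specific template S3 URL from above.
    TEMPLATE_URL = "https://example-bucket.s3.amazonaws.com/piapp-template.json"

    cf = boto3.client("cloudformation", region_name="us-east-1")
    cf.create_stack(
        StackName="piapp-analytics-stack",   # any name you like
        TemplateURL=TEMPLATE_URL,
        Capabilities=["CAPABILITY_IAM"],     # assumed: the stack creates IAM roles
    )
    # Block until the stack reaches CREATE_COMPLETE before moving on.
    cf.get_waiter("stack_create_complete").wait(StackName="piapp-analytics-stack")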

Step 2: Update the secret with your PBO login credentials

    Go to AWS Services → Secrets Manager

Find the secret starting with TemplatedSecret*** and update it with your PBO username and password. Note: Make sure to append your security token to your password.
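The same update can be done from a script. In this hedged boto3 sketch, the secret name and the JSON key names are assumptions; inspect the existing TemplatedSecret*** value in the console first and keep whatever structure it already has:

    import json
    import boto3

    sm = boto3.client("secretsmanager", region_name="us-east-1")

    secret_name = "TemplatedSecret-abc123"  # placeholder: copy the real name from the console
    password_plus_token = "myPassword" + "mySecurityToken"  # token appended to the password

    # Assumed key names -- match the structure of the existing secret.
    sm.put_secret_value(
        SecretId=secret_name,
        SecretString=json.dumps({
            "username": "api.user@yourpbo.example.com",
            "password": password_plus_token,
        }),
    )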

Below is a short video showing Steps 1 and 2 (Open the image in a new tab for best viewing).

Another saying that will never go out of style: Test, test, test. It’s time to walk through some manual testing. This will take around 30 minutes.

Step 1: In Salesforce, create Log Request records manually. In your PBO org, go to the “PI Labapp-Config” app, open the “Bulk Log Request” tab, select an app, and create records requesting log pulls for a couple of days in the past (earlier than today’s date).

This should create Log Pull requests under the “Log Pull Activity” tab. In Step 2, when we run the AWS Step Function, it will process these records.

Step 2: In AWS, run the Step Function

  • Go to AWS Services → Step Functions → State machines
  • Select the state machine whose name starts with “piappLogRequestStateMachine***”
  • Click “New Execution” and then “Start Execution”
  • Monitor the status: if everything goes well, all steps except the last should go green. Note: The last step will be red (failed) because it relates to Athena, which we have not yet set up, so ignore it for the time being.
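For context, requesting a daily log file from Salesforce boils down to creating an AppAnalyticsQueryRequest record through the API. The sketch below, using the third-party simple-salesforce library, is a rough illustration of that request, not the stack’s actual Lambda code; the credentials and package ID are placeholders:

    from simple_salesforce import Salesforce  # third-party: pip install simple-salesforce

    # Placeholders: use the API user credentials stored in Secrets Manager.
    sf = Salesforce(
        username="api.user@yourpbo.example.com",
        password="myPassword",
        security_token="mySecurityToken",
    )

    # Create a request for one day's Package Usage Log.
    sf.AppAnalyticsQueryRequest.create({
        "DataType": "PackageUsageLog",
        "PackageIds": "033000000000001",          # 15-character package ID
        "StartTime": "2021-01-01T00:00:00.000Z",
        "EndTime": "2021-01-02T00:00:00.000Z",
    })
    # Poll the record until RequestState is "Complete", then fetch its DownloadUrl.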

Step 3: In Salesforce, validate the status. In your PBO org, go to the “PI Labapp-Config” app and open the “Log Pull Activity” tab to see the status of the log pulls.

Step 4: In AWS, validate the S3 files

  • Go to AWS Services → S3
  • Find the bucket whose name starts with “piappjscdkstack-piapppidailylogbucket***”
  • You should see a folder structure similar to the below image, with daily log files under the appropriate date folders
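You can also spot-check the bucket from a script. A minimal boto3 sketch; the bucket name is a placeholder, so copy the real one from the console:

    import boto3

    s3 = boto3.client("s3")
    bucket = "piappjscdkstack-piapppidailylogbucket-abc123"  # placeholder name

    # List the first 50 objects; expect date-partitioned daily log file keys.
    resp = s3.list_objects_v2(Bucket=bucket, MaxKeys=50)
    for obj in resp.get("Contents", []):
        print(obj["Key"], obj["Size"])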

Now that manual testing is done, let’s enable the Daily Log Pull job. This will take around 10 minutes.

Step 1: In Salesforce PBO org:

    Under the “Log Pull Config” record of your app, update the “EnableDailyPull” flag to true

Step 2 (Optional): In AWS:

    Go to AWS Services → CloudWatch → Rules and change the timing of the run
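The run timing is just the rule’s schedule expression, so it can also be changed with boto3. In this sketch the rule name is a placeholder (copy the real one from the CloudWatch console), and note that put_rule overwrites the schedule of the named rule:

    import boto3

    events = boto3.client("events", region_name="us-east-1")
    events.put_rule(
        Name="piapp-daily-log-pull-rule",        # placeholder rule name
        ScheduleExpression="cron(0 6 * * ? *)",  # example: run daily at 06:00 UTC
    )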

We’re almost done! Let’s spend 15 minutes getting AWS Athena up and running.

Step 1: Run the Glue crawler to detect the Parquet table from the S3 bucket

In AWS, go to AWS Services → Glue. Click on “Crawlers” and run the crawler so that it detects the table created from the S3 bucket where the daily log files are stored (the name starts with piappjscdkstack-piapppidailylogbucket***). Wait for the status to be “Ready”, and you should see one new table created.

Step 2: Run the Athena queries

In AWS, go to AWS Services → Athena → Athena query editor. Before you run your very first Athena query, you need to set up a query result location in Amazon S3: select the existing bucket whose name starts with “pilabapp-piapppiathenaoutbucket***”. Then:

  • Select the “piappdb” database
  • Validate the table: SELECT * FROM "piappdb"."apps" limit 10;
  • Run the query MSCK REPAIR TABLE apps2 to update the log partitions

Step 3: Create Views

  • In the Athena query editor, select the “piappdb” database
  • Open the gist which has the View definitions
  • Copy and run each View creation query (e.g. logins, entity) one by one
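The same queries can be run programmatically, as in this boto3 sketch; the output bucket name is a placeholder for the pilabapp-piapppiathenaoutbucket*** bucket created by the stack:

    import boto3

    athena = boto3.client("athena", region_name="us-east-1")
    OUTPUT = "s3://pilabapp-piapppiathenaoutbucket-abc123/"  # placeholder bucket

    def run(query: str) -> str:
        """Submit a query against the piappdb database; return its execution ID."""
        resp = athena.start_query_execution(
            QueryString=query,
            QueryExecutionContext={"Database": "piappdb"},
            ResultConfiguration={"OutputLocation": OUTPUT},
        )
        return resp["QueryExecutionId"]

    run("MSCK REPAIR TABLE apps2")                   # refresh the log partitions
    run('SELECT * FROM "piappdb"."apps" LIMIT 10')   # validate the table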

This last step is completely optional, but we still wanted to include it. At this point, you are ready to point Tableau to Athena table/views and start creating dashboards on the App Analytics data.

Note: Tableau Desktop or Tableau Online is required. Don’t have it yet and are interested? You can get a 14-day free trial here.

Here are a few of the questions we get most often.

Q: How can I monitor AWS step function failures?

A: Go to the Step Function execution history, select an execution, and hover over the different steps in the workflow to check the Input, Output, Exceptions, etc. for each step. You can even check the input/output of steps inside the inner loops (change the index value in the dropdown).
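The same digging can be scripted; a hedged boto3 sketch in which the state machine ARN is a placeholder:

    import boto3

    sfn = boto3.client("stepfunctions", region_name="us-east-1")
    ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:piappLogRequestStateMachine"  # placeholder

    # Dump the event history of the most recent failed execution.
    failed = sfn.list_executions(stateMachineArn=ARN, statusFilter="FAILED")
    for execution in failed["executions"][:1]:
        history = sfn.get_execution_history(executionArn=execution["executionArn"])
        for event in history["events"]:
            print(event["type"])  # look for *Failed events and inspect their error/cause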

Q: Can I change the deployed AWS step function and Lambda code directly in AWS console?

A: Absolutely yes.

Q: How do I update my existing AWS stack with the new version?

A: Go to CloudFormation, select the existing stack, click the Update button, select “Replace current template”, and use the same S3 URL.

For us-east-1 region, use this link for the S3 URL.

For ap-northeast-1 region, use this link.

Then, click Next. This should update your AWS stack.
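For reference, the scripted equivalent with boto3 looks like this sketch; the stack name and template URL are placeholders, and the CAPABILITY_IAM capability is the same assumption as during creation:

    import boto3

    cf = boto3.client("cloudformation", region_name="us-east-1")
    cf.update_stack(
        StackName="piapp-analytics-stack",  # placeholder: your existing stack name
        TemplateURL="https://example-bucket.s3.amazonaws.com/piapp-template.json",
        Capabilities=["CAPABILITY_IAM"],
    )
    cf.get_waiter("stack_update_complete").wait(StackName="piapp-analytics-stack")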

Having a lens into how users interact with your solutions on the AppExchange can prove vital to your strategy and planning. I hope this blog post has helped you configure your Salesforce PBO org and AWS stack so that they pull the daily log files and you can start getting these insights.

