Debugging Einstein Analytics

Fellow Trailblazers,

In this article, I am going to touch upon the governance side of EA implementations, highlighting a few out-of-the-box EA debugging techniques that are relevant for any EA implementation of any scale or size. Building on the Einstein platform is fun because, as an EA SME, you have the power to build intuitive EA assets for your end users in just a few clicks.

It’s fun until we get stuck with bugs, low-performing dashboards (slow SAQL queries), or syntax errors in our implementations. Being able to debug is a core skill for every EA SME or consultant. It separates the wheat from the chaff. In the end, it increases your own value immensely, since you are a problem solver.

Disclaimer: This article focuses only on Einstein Analytics development, specifically some of the debugging techniques.


“Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.” — Brian Kernighan


How can you understand a dashboard better and faster?

Some SMEs are geniuses and understand everything in little time. If you are like me, you do not understand a lot by just glancing at the JSON code. You might need to run the dashboard a couple of times and experiment with it to discover some properties of its functionality. Preventing bugs is often a matter of proper architecture and schema design, which makes it easier to reason about the dashboard's functionality.

Don’t forget to join our EA Success Community

There are a lot of ongoing discussions within our Trailblazer community and by our own EA evangelists, Antonio Scaramuzzino & Terence Wilson from the EA COE team, on how to approach large-scale implementations, business KPI refinements, performance testing, and more.

Lastly, I would encourage you all to join our EA Success Community, enroll in campfire sessions, or read the public knowledge articles from the product team to deep dive further.

Topics Overview :

Below are the key topics I am going to touch upon in this article.

  • Browsers
  • Browser Caching
  • Browser Developer Console
  • EA Dashboard Inspector
  • EA Intermediate (Hidden) Vs Unused Steps
  • Salesforce Workbench
  • EA Dataflow Notifications
  • Freely Available Tools & Utilities

Let’s Deep Dive :

1. Browsers :

Performing your tests in EA on multiple browsers might yield different outcomes. Having the right browser enablement is crucial for providing a seamless experience to your end users. Make sure you are on up-to-date, stable browser versions while performing your tests in EA.

EA is supported on Microsoft Edge, Microsoft Internet Explorer version 11, and the most recent stable versions of Mozilla Firefox and Google Chrome.

Key Considerations :

  • Analytics isn’t supported on Apple Safari
  • The minimum screen resolution required to support all Salesforce features is 1024 x 768 pixels.

2. Browser Caching :

Background: The browser cache is a temporary storage location on your computer for files downloaded by your browser to display websites. Files that are cached locally include any documents that make up a website, such as HTML files, CSS style sheets, and JavaScript scripts, as well as images and other multimedia content.

When you revisit a page, the browser checks which content has been updated in the meantime and only downloads updated files or files not already stored in the cache. This reduces bandwidth usage on both the client and server side and allows the page to load faster. Hence, the cache is especially useful when you have a slow or limited Internet connection.
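
To make the mechanics concrete, here is a minimal sketch (not EA-specific) of the conditional-request behavior browsers rely on, written with Python's requests library. The URL is a placeholder for any static asset that returns an ETag.

```python
# Minimal sketch (not EA-specific): how browser-style cache revalidation works.
# The URL below is a placeholder; any static asset that returns an ETag will do.
import requests

url = "https://example.com/static/app.css"  # hypothetical static asset

first = requests.get(url)
etag = first.headers.get("ETag")
print(first.status_code, len(first.content), "bytes downloaded")

if etag:
    # Revalidate: ask the server to send the body only if it has changed.
    second = requests.get(url, headers={"If-None-Match": etag})
    # HTTP 304 (Not Modified) means the cached copy is still valid and nothing was re-downloaded.
    print(second.status_code)
```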

Cache Warming:

Cache warming is designed to help reduce the execution times of the initial queries executed against a database. This is done by preloading the database server’s cache with pages that were referenced the last time the database was started. Warming the cache can improve performance when the same or similar queries are executed against a database each time it is started.

The advantage of cache warming is that you can get content into your cache, ready for user traffic, without making users experience slow, non-cached delivery times. It also means your users put less load on the backend servers, delivering a better experience for your end users.

EA Backend/Infrastructure:

In the EA backend, there is no difference in resource allocation (data center) between sandbox and production instances. However, you might experience slower sandbox orgs or less performant queries compared with production instances. That is usually because production orgs have much higher utilization, so their data tends to stay in the cache, while sandboxes are used less frequently, so data is not always in the cache and needs to be loaded directly from disk.

These kinds of issues can be mitigated by loading the dashboard from scratch a few times (three or four) so that the cache gets warmed. Loading from scratch here means clearing the local browser cache and refreshing your browser.
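
If you want to observe this warm-up effect directly, here is a rough sketch that runs the same SAQL query a few times via the Analytics REST API query resource and compares timings. The instance URL, access token, and the dataset id/version inside the SAQL are placeholders for your own org values, so treat this as an illustration rather than a ready-made tool.

```python
# Rough sketch: run the same SAQL query a few times and compare timings as the cache warms.
# INSTANCE_URL, ACCESS_TOKEN, and the dataset id/version in the SAQL are placeholders.
import time
import requests

INSTANCE_URL = "https://yourInstance.salesforce.com"    # placeholder
ACCESS_TOKEN = "00D...!AQ..."                           # placeholder session ID / OAuth token
SAQL = 'q = load "datasetId/versionId"; q = group q by all; q = foreach q generate count() as rows;'

url = f"{INSTANCE_URL}/services/data/v52.0/wave/query"  # Analytics REST API query resource
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

for attempt in range(1, 5):                             # three or four runs, as suggested above
    start = time.time()
    resp = requests.post(url, headers=headers, json={"query": SAQL})
    print(f"run {attempt}: HTTP {resp.status_code} in {time.time() - start:.2f}s")
```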

3. Browser Developer Console :

Every modern web browser includes a powerful suite of developer tools. These tools do a range of things, from inspecting currently loaded HTML, CSS, and JavaScript to showing which assets the page has requested and how long they took to load. This is another great utility I use often to inspect my EA SAQL queries in the Network tab. Let’s explore a few attributes of the Chrome developer console for this example.

Step 1: Right-click in the browser → select Inspect

Step 2: Right-click in the browser → select Inspect → Network tab

  • 01 – Network Tab
  • 02 – Request Status (HTTP Status Code)
  • 03 – Total Number of Requests
  • 04 – Query runtime
  • 05 – Filter
  • 06 – Waterfall –
    • Start Time. The first request that was initiated is at the top.
    • Response Time. The first request that started downloading is at the top.
    • End Time. The first request that finished is at the top.
    • Total Duration. The request with the shortest connection setup and request/response is at the top.
    • Latency. The request that waited the shortest time for a response is at the top.
  • 07 – Browser Concurrency – six requests are executed at a time, which is the standard browser concurrency limit

Waterfall:

The Waterfall is a visual breakdown of each request’s activity. Use it to view the timing of requests in relation to one another. By default, the Waterfall is organized by the start time of the requests, so requests that are farther to the left started earlier than those that are farther to the right. When you hover over this attribute, you can explore the detailed runtime of the executed query.

Manually Clear Browser Cookies :

To manually clear browser cookies at any time, right-click anywhere in the Requests table and select Clear Browser Cookies.

Export requests data :

As a power user, you can export the request data and share it with Salesforce support engineers to deep dive and investigate the issue occurring on your dashboard; a small script for analyzing the export follows the steps below. To save all network requests to a HAR file:

  • Right-click any request in the Requests table.
  • Select Save as HAR with Content. (Note: DevTools saves all requests that have occurred since you opened DevTools to the HAR file. There is no way to filter requests, or to save just a single request.)
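
HAR files are plain JSON, so once exported they are easy to inspect yourself before (or while) raising a case. Below is a minimal sketch that lists the slowest requests from an exported file; the file name is a placeholder.

```python
# Minimal sketch: list the slowest requests from a HAR file exported via "Save as HAR with Content".
# "dashboard.har" is a placeholder file name; HAR is plain JSON, so no extra libraries are needed.
import json

with open("dashboard.har", encoding="utf-8") as f:
    har = json.load(f)

entries = har["log"]["entries"]
slowest = sorted(entries, key=lambda e: e["time"], reverse=True)[:10]

for entry in slowest:
    url = entry["request"]["url"]
    status = entry["response"]["status"]
    # Filter on "wave" or "query" in the URL if you only care about the EA/SAQL calls.
    print(f'{entry["time"]:8.0f} ms  HTTP {status}  {url[:100]}')
```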

4. EA Dashboard Inspector :

EA’s standard Dashboard Inspector is a great utility to inspect your dashboard’s performance within Einstein Analytics. It was added to the platform a few releases ago, and the product team keeps expanding it to provide more intuitive information for EA dashboard debugging in a few clicks.

  • It identifies different types of bottlenecks, like query issues and redundant queries.
  • It provides recommendations to improve performance.

I tend to combine its results with the network console very often to narrow down performance topics. It is essential that your EA assets are as optimized and performant as possible, based on the EA best practices published by the product team.

Key Considerations :

If a dashboard contains multiple pages, run the inspector on each page; the inspector provides results only for the current page. Also, from the Summer ’19 release, we can inspect failed steps using this utility. Prior to Summer ’19, query-level information was only available if the query ran successfully. In Summer ’19 and later, when a step has errors and fails to execute, open the Dashboard Inspector to deep dive into the root cause in a few clicks.

5. EA Intermediate (Hidden) Vs Unused Steps :

It is essential to have the right step placement on the pages of your dashboard. It is equally important that your dashboards are not cluttered with unused steps or with intermediate (hidden) steps whose significance on your EA asset page you are not aware of. Let’s quickly understand the terms first –

  • Unused Steps: These steps are not used by any EA widget or referenced in any page or layout of your EA dashboard. They can easily be identified in the UI by selecting the ‘Unused Steps’ tab on the dashboard canvas (snapshot below).
  • Intermediate (Hidden) Steps: These steps are never visualized but might be used in your business use case, for example when you combine multiple datasets with results bindings, or in conjunction with your SOQL steps to retrieve values that populate another step.


In complex use cases, continuous development across teams on the platform can lead to adding any number of intermediate (hidden) steps, which may be referenced somewhere on your dashboard but are not consumed from a business perspective at all. Now the big question is: how should a power user or EA consultant identify these kinds of intermediate (hidden) steps? Let’s take an example.

I have 19 total steps in my dashboard below. Out of the 19, there is only one intermediate (hidden) step, which I would like to identify, and I cannot get any clue from the UI. Now imagine an asset with 20 EA pages and 250 EA components. How will you debug faster in those scenarios?

After selecting the ‘Unused Steps’ tab on the canvas, the intermediate (hidden) step is not visible, because it is not consumed in the layout but is still used or referenced somewhere in your JSON by other steps for your use case.

To handle these kinds of dashboard JSON scenarios, I prefer to use an online JSON editor and compare my EA dashboard JSON against its respective states (snapshot attached). A rough script for the same idea follows below.
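
If you prefer scripting it, here is a rough sketch of the same idea in Python. It assumes you have exported the dashboard JSON (for example via the REST API or the editor) and that it follows the usual "state" → "steps" / "widgets" layout; verify the property names against your own export, and note that the simple substring check can give false positives if one step name is contained in another.

```python
# Rough sketch: classify dashboard steps as widget-consumed, intermediate (hidden), or unused.
# Assumes the exported dashboard JSON uses the usual "state" -> "steps" / "widgets" layout;
# "dashboard.json" is a placeholder file name. The substring check is intentionally crude.
import json

with open("dashboard.json", encoding="utf-8") as f:
    dashboard = json.load(f)

state = dashboard.get("state", dashboard)        # some exports nest the definition under "state"
steps = state.get("steps", {})
widgets_blob = json.dumps(state.get("widgets", {}))

for name in steps:
    used_by_widget = name in widgets_blob
    # Referenced by another step (e.g. through a results binding) but never placed on a widget?
    other_steps_blob = json.dumps({k: v for k, v in steps.items() if k != name})
    referenced_elsewhere = name in other_steps_blob

    if used_by_widget:
        kind = "consumed by a widget"
    elif referenced_elsewhere:
        kind = "intermediate (hidden)"
    else:
        kind = "unused"
    print(f"{name}: {kind}")
```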

6. Salesforce Workbench :

Salesforce Workbench is another great utility I use often to retrieve my EA metadata. Sometimes it gets tricky to retrieve EA asset technical names or to test your SOQL step query against the data; Workbench makes it much quicker for me. I would highly recommend using this utility regularly in your EA engagements and build phase.
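
If you would rather script the lookup, here is a minimal sketch that lists dataset API (developer) names through the Analytics REST API datasets resource, which is the name you need in SAQL load statements. The instance URL and access token are placeholders, and the response field names should be checked against your API version.

```python
# Minimal sketch: list EA dataset API (developer) names via the Analytics REST API.
# INSTANCE_URL and ACCESS_TOKEN are placeholders for your own org and session/OAuth token.
import requests

INSTANCE_URL = "https://yourInstance.salesforce.com"    # placeholder
ACCESS_TOKEN = "00D...!AQ..."                           # placeholder

resp = requests.get(
    f"{INSTANCE_URL}/services/data/v52.0/wave/datasets",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

for ds in resp.json().get("datasets", []):
    # "name" is the API/developer name used in SAQL load statements; "label" is the display name.
    print(ds.get("name"), "-", ds.get("label"), "-", ds.get("id"))
```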

7. EA Dataflow Notifications :

We can also set dataflow notifications to receive an email notification when a dataflow job finishes.

You can be notified:

  • only when there are warnings,
  • only when the dataflow fails,
  • or every time the dataflow finishes.

You can also set an elapsed time notification to notify you when a dataflow is still running after a specified length of time.

Key Considerations :

By default, email notifications are sent only to the person setting the notifications. Other people with the Edit Analytics Dataflows permission can subscribe to a dataflow by setting notifications.
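
As a programmatic complement to the email notifications, you can also poll recent dataflow job statuses. The sketch below assumes the Analytics REST API dataflow jobs resource (/wave/dataflowjobs); the endpoint, response keys, and credentials shown here are assumptions and placeholders, so verify them against your org's API version.

```python
# Rough sketch: poll recent dataflow job statuses as a complement to email notifications.
# The /wave/dataflowjobs resource and its response field names are assumptions; verify them
# against your API version. INSTANCE_URL and ACCESS_TOKEN are placeholders.
import requests

INSTANCE_URL = "https://yourInstance.salesforce.com"    # placeholder
ACCESS_TOKEN = "00D...!AQ..."                           # placeholder

resp = requests.get(
    f"{INSTANCE_URL}/services/data/v52.0/wave/dataflowjobs",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

for job in resp.json().get("dataflowJobs", []):
    print(job.get("label"), job.get("status"), job.get("startDate"))
```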

8. Freely Available Tools & Utilities :

Below are some of the free applications and tools I use very often for my EA-related debugging tasks. I will continue to add more in the future.

  • Network and Browser Performance -> <your salesforce org url>/speedtest.jsp
  • Browser Versioning Update:
  • JSON Editor :
  • Cache Clearing Chrome Plugins:
    • Clean Master – Link
    • Super History & Cache Cleaner – Link


Conclusion : (My Learnings)

As a Solution Architect, I often enter projects that have few established debugging workflows or governance policies related to EA implementation. Sometimes there is no accountability policy at all, so issues might even be ignored completely.

Additionally, this is a very wide topic, one you could build a whitepaper on. Keep in mind that the product is expanding aggressively, and some of these techniques might evolve or be superseded in the near future (safe harbor). We as a product company are ourselves learning a lot from our customers and from the odd use cases shared by our Trailblazers. Our product team is working tirelessly to build tools and applications that help our users identify and resolve issues as fast as possible. Remember: this is an ongoing process and a continuous development effort 🙂

When a developer or consultant is assigned to an issue, it is often tackled by trial and error. You might have already guessed that this is the least effective approach to solving a problem. So what comes first is analysis.

Grab all the information you can by using some of the tools above, which are extremely helpful if used in the right way. Don’t forget to align yourself with basic EA best practices (design & data) while narrowing down your root cause.


By doing this –

  • It helps you align and scale your EA implementation with SFDC-recommended best practices over time.
  • You can reduce the issue resolution time drastically by providing the appropriate background to your issue to our product engineers.
  • Lastly, it increases your own value immensely, since you are a problem solver or a true trailblazer in every sense.


For those new to this topic, I hope this gave you a fair idea of some debugging techniques on the Einstein platform.

Let me know your feedback or comments.

I hope this helps.

Cheers!
Varun
