Sam was excited to use Asana AI Studio to help with the org’s capacity planning, but a few things were happening (or…not happening).

Diagnosing

There are a few key things I check when I’m first trying to diagnose a broken rule, especially an AI Studio rule:

  1. Assess complexity: Are we asking too much?

  2. Evaluate guidance: Was the AI provided enough context to take action?

  3. Check the custom fields: Are the custom fields set up in the correct format?

  4. Review the task log: What did AI Studio tell us about its decision?

Step 1. Assessing Complexity

On the surface, Sam’s rule didn’t read as complex. For the AI agent, though, a lot was being asked. Each custom field being populated by AI needed to follow a specific protocol, whether that was binary logic, evaluating a pre-existing matrix, or assessing existing capacity based on open projects.

Having this many evaluations isn’t a bad thing, but it does make diagnosing and debugging tricky. It results in long lists of activity logs, not to mention consuming a lot of AI Studio credits.

One key issue with the AI Studio rule was that it was incorrectly summing the individual fields and writing the total into a separate custom field, resulting in a sum greater than 100%. This is a perfect use case for roll-up custom fields.

Assessment: Pretty Complex
Recommendation: Consider basic rules for the less tricky steps and reserve the Use AI feature for the heavier lifts.
Recommendation: Use roll-up fields to sum each custom field’s percentage so the AI isn’t responsible for the math (see the sketch below).
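
To make the arithmetic concrete: summing a handful of percentage fields and flagging over-allocation is deterministic work that a roll-up field (or a basic rule) handles far more reliably than an AI step. Here is a minimal sketch of that check in Python; the field names and values are hypothetical placeholders, not Sam’s actual setup.

    # Hypothetical capacity fields for one teammate -- in Asana these would be
    # numeric custom fields on the task, with a roll-up field doing the sum.
    capacity_fields = {
        "Project Alpha %": 40,
        "Project Beta %": 35,
        "Admin & Meetings %": 30,
    }

    total = sum(capacity_fields.values())
    print(f"Total allocation: {total}%")

    # This is the exact failure Sam was seeing: a total greater than 100%.
    if total > 100:
        print(f"Over-allocated by {total - 100}% -- review the individual fields.")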

Step 2. Evaluate Guidance

My next step was to take a look at the Guidance dialog box. Oftentimes, we don’t give our AI agent enough context to make accurate decisions. Considering the complexity of this rule, that was my first guess.

Assessment: Sam’s guidance was robust and clear. She properly attached additional documentation to aid in decision-making and referenced projects and teams to provide access.

Step 3. Double Check Custom Fields

One of the ways this rule was failing was that it was not providing the numbers as percentages. That immediately suggested to me that something was off with the way the numeric custom fields were set up. This turned out to be the case for a couple of custom fields that had been left at their default number format.

Assessment: Custom fields may not be set up to return percentages.
Recommendation: Confirm the settings for each numeric field and ensure they are formatted as percentages (one way to audit this via the API is sketched below).
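
If there are a lot of fields to check, the Asana REST API can speed up the audit. The sketch below lists the custom fields attached to a project and flags any number field that isn’t formatted as a percentage. The project GID is a placeholder, and you should confirm the property names (resource_subtype, format) against the current API reference before relying on this.

    import os
    import requests

    # Placeholders -- substitute your own project GID; the token is an Asana
    # personal access token read from an environment variable.
    TOKEN = os.environ["ASANA_TOKEN"]
    PROJECT_GID = "1234567890"

    resp = requests.get(
        f"https://app.asana.com/api/1.0/projects/{PROJECT_GID}/custom_field_settings",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"opt_fields": "custom_field.name,custom_field.resource_subtype,custom_field.format"},
    )
    resp.raise_for_status()

    # Flag number fields that are not formatted as percentages.
    for setting in resp.json()["data"]:
        field = setting["custom_field"]
        if field.get("resource_subtype") == "number" and field.get("format") != "percentage":
            print(f"Check '{field.get('name')}': format is {field.get('format')}")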

Step 4. Read the Output

Arguably, this could be done as step 1, but for me it was important to understand what exactly this rule was meant to do before I dug into the logs. Completing the other steps gave me more context about what the workflow was supposed to accomplish and where it may have gone wrong.

Because of the complexity, the logs for this rule were long. Each step where Asana AI Studio takes an action is recorded. When I recreated the automation in my sandbox using a template rubric, this was the log for just one custom field:

Recommendation: Locate an example task that failed to produce the correct output and read through each of the logs. Make note of each decision that was either a) made erroneously, e.g., did not properly follow the guidance, or b) failed to run at all.
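
A low-tech way to keep those notes organized is to transcribe each logged decision into a small structure and filter for the failures. The entries below are made up for illustration, since the AI Studio activity log itself lives in the Asana UI.

    # Hypothetical notes transcribed from the activity log for one failing task.
    log_notes = [
        {"step": "Set 'Project Alpha %'", "status": "ok"},
        {"step": "Set 'Capacity Total'", "status": "erroneous", "note": "sum exceeded 100%"},
        {"step": "Set 'Priority'", "status": "did_not_run"},
    ]

    # Surface only the decisions that were erroneous or never ran.
    for entry in log_notes:
        if entry["status"] != "ok":
            print(f"{entry['step']}: {entry['status']} -- {entry.get('note', 'no detail logged')}")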


Building with Intention

While we weren’t able to solve Sam’s issues during the webinar, I was able to send her off with some actionable steps to get closer to the answer.

When building complex rules like this one, I highly recommend testing as you build. Start with your first branch, run a test, and review the logs. Rinse, repeat. Make sure that each AI Studio action is properly functioning before adding the next.

More Resources

Here are some additional relevant resources that may help:

Join future Asana AI Office Hours

Click here to view all upcoming Asana events, including our monthly Asana AI Studio Office Hours. This open Q&A session is your chance to get live answers and hands-on guidance from our team. Whether you’re fine-tuning a prompt, exploring AI Studio features, or looking for best practices, bring your questions and use cases for a collaborative, real-time learning experience.

If you already know what you’d like to ask, you can submit your question using this form. Not sure yet? Don't sweat it! We will be taking questions on the call.