Circular Dependency Error

Question

• Hello

I'm trying to add this formula to the data model:

EMA: (Momentum - Previous Day EMA) x 0.0769 + Previous Day EMA

Where
Momentum: (Today Close - Previous Day Close)
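The formula is recursive: each day's EMA refers to the previous day's EMA. As a hypothetical Python sketch of the intended calculation (seeding with the first observation is an assumption, not part of the question):

```python
# Hypothetical Python sketch (not DAX) of the recursive calculation above.
# alpha = 0.0769 is approximately 2 / (25 + 1), a 25-period smoothing factor.
def ema_series(momentum, alpha=0.0769):
    """EMA where each day's value depends on the previous day's EMA."""
    out = []
    prev = momentum[0]                    # seed with the first observation
    for m in momentum:
        prev = (m - prev) * alpha + prev  # (Momentum - Prev EMA) * a + Prev EMA
        out.append(prev)
    return out
```

The circularity is visible here: `prev` feeds back into itself, which is exactly what a DAX calculated field cannot do.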

But when I build the calculated field I get:
"A circular dependency was detected: 'FactTable'[EMA],'FactTable'[EMA]."

I cannot see how to write the formula to avoid the error.

Here's the current (incorrect) formula:

EMA =
(
    FactTable[MOMENTUM]
    - (
        MAXX (
            FILTER (
                ALL ( Calendar ),
                MAX ( Calendar[TradingDayNumber] ) - 1
            ),
            [EMA]
        )
    )
)
* 0.0769
+ (
    MAXX (
        FILTER (
            ALL ( Calendar ),
            MAX ( Calendar[TradingDayNumber] ) - 1
        ),
        [EMA]
    )
)

Wednesday, July 23, 2014 9:45 PM


All replies

• Hi NicoPer,

It looks like this is happening due to the CALCULATE function - I'm assuming that the [EMA] reference in the second argument for MAXX is a calculated field. This article discusses circular dependencies and how to fix them: Understanding Circular Dependencies in Tabular and PowerPivot.

If you need some help applying this in your scenario, could you kindly provide a sample workbook that we can download from a publicly accessible location such as OneDrive or Dropbox? Alternatively, you can paste some test data here, briefly describe the relationships, and paste the DAX formulas for the relevant calculated columns and calculated fields.

Regards,

Michael

Website: nimblelearn.com, Blog: nimblelearn.com/blog, Twitter: @nimblelearn

Thursday, July 24, 2014 9:33 AM
• Hi Michael

Thanks for your reply. I've seen that page but still can't find a way around it.

As requested, I've uploaded a sample spreadsheet so you can see the problem:
Circular Dependency in DAX

Thursday, July 24, 2014 8:14 PM
• Hi NicoPer,

I've taken a look at the workbook and understand what you're trying to achieve. Unfortunately, this type of recursion isn't supported in DAX, most likely because it could result in an infinite loop, which would cause Excel to run out of memory.

Regards,

Website: http://www.nimblelearn.com
Blog: http://www.nimblelearn.com/blog

Wednesday, July 30, 2014 1:51 PM
So there's no possible way to do it in DAX because of the recursion. OK.

But there *must* be a way to add it to the data model, right?

I was trying to change the formula to remove the recursion, with no luck. (I found a formula, but the results are not even close.)

I recently found some people talking about building it in SQL and then bringing the results into Excel. Would that be possible?

I have no experience with SQL, but if this can be done easily with it, I'd like to know about it.

Can anyone give me some directions?

Wednesday, July 30, 2014 8:10 PM
The most elegant Exponential Moving Average (EMA) solution written in SQL that I've seen is the last one in this blog post. It may be possible to convert the logic into DAX syntax, as it doesn't appear to use any recursion. This is something I'll be looking into when I get a bit more time to sit down and dissect the logic, although I invite any other forum members to take a look at this as well.

Regards,

Website: http://www.nimblelearn.com
Blog: http://www.nimblelearn.com/blog

Thursday, July 31, 2014 1:48 PM
Thank you very much for the link.
I'd love to see that code turned into DAX.

Thursday, July 31, 2014 8:45 PM
Here is a stab at it. If this isn't right, I believe it is headed in the proper direction, so hopefully it can at least help inspire other ideas.

I broke this out into 5 separate measures so I could keep track of it all.  It gets complicated so I will list each measure in the order they should be added and explain what it does (or at least what I intended it to do).

I'm sure there are opportunities to consolidate and/or optimize these but step one is getting something in place that works.

1. SumDate - I couldn't get measure number 2 to function properly without first creating this as a separate measure:

SumDate:=
IF(
ISBLANK(
SUM(FactTable[CLOSE])
),
BLANK(),
SUM(Calendar[FullDate])
)

2. Rnk - This measure is intended to dynamically rank the calendar days that had a CLOSE value:

Rnk:=
RANKX(
FILTER(
SUMMARIZE(ALL(Calendar),
Calendar[FullDate],
"Sum",
SUM(FactTable[CLOSE])
),
[Sum]>0),
[SumDate],,0,DENSE
)

3. EMA Exp - This measure is intended to calculate the weighted exponent for each day based on its dynamic rank.  I defaulted the window to 25 since that's what you appeared to be using for your smoothing factor (.0769). The window can be changed by updating the 25 in both places in this formula:

EMA Exp:=
(1-(1-(2/(25+1))))*(1-(2/(25+1)))^
RANKX(
ALLSELECTED(Calendar),
[Rnk],,1,DENSE)
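For an N-period EMA the smoothing factor is alpha = 2/(N+1), and the measure above effectively assigns the day k steps back a weight of alpha * (1 - alpha)^k. A quick Python check (an illustration, not the DAX itself) that these weights decay geometrically and sum toward 1:

```python
# Weights implied by [EMA Exp]: alpha * (1 - alpha)**k for a day
# k steps back, where alpha = 2 / (N + 1) for an N-period EMA.
N = 25
alpha = 2 / (N + 1)           # ~0.0769, the smoothing factor in the thread
weights = [alpha * (1 - alpha) ** k for k in range(500)]
total = sum(weights)          # geometric series: 1 - (1 - alpha)**500
```

The most recent day always carries the largest weight, and with enough history the weights sum to (essentially) 1, which is what makes the normalized weighted average below work.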

4. WeightedClose - This is simply the CLOSE times its exponential weighting:

WeightedClose:=
SUM(FactTable[CLOSE])*[EMA Exp]

5. EMA - This is the final measure, which has now basically become a standard weighted average:

EMA:=
DIVIDE(
SUMX(
Calendar,
[WeightedClose]
),
SUMX(
Calendar,
[EMA Exp]
)
)
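Together, measures 3-5 replace the recursion with a normalized weighted average: the EMA becomes sum(weight * CLOSE) / sum(weight), with weight alpha * (1 - alpha)^k for the day k steps back. A hypothetical Python comparison against an ordinary recursive EMA (synthetic prices, not the workbook data) suggests the two forms agree once enough history is in scope:

```python
# The weighted-average form of the EMA (measures 3-5 above) compared
# with a conventional recursive EMA; synthetic data, alpha = 2 / 26.
alpha = 2 / 26
closes = [100 + (i % 7) - 3 for i in range(300)]   # made-up close prices

# recursive EMA, seeded with the first close
rec = float(closes[0])
for c in closes[1:]:
    rec = alpha * c + (1 - alpha) * rec

# non-recursive weighted average over the full history;
# the weight for the day k steps back is alpha * (1 - alpha)**k
w = [alpha * (1 - alpha) ** k for k in range(len(closes))]
wavg = sum(wi * c for wi, c in zip(w, reversed(closes))) / sum(w)
```

With 300 days of history the two values differ only by a vanishing seed term, which is consistent with the "start-up period" behavior discussed later in the thread.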

Here is a comparison of the results I am getting with this calculation vs. a standard average:

As you can see, in the EMA calculation, the more recent dates carry more weight in the monthly and yearly subtotals than do the older dates when compared to the True Average.

So, hopefully this either works for you or can help inspire better solutions.

Friday, August 01, 2014 7:57 PM
• Hi Mike

Thank you very much for stepping in.

I've tried it, but I'm getting exactly the same values as the CLOSE for the day.
(I copied the formulas exactly.)

I've even changed the window to 5 instead of 25 and still get the CLOSE.

Friday, August 01, 2014 9:36 PM
I'll sleep on it. I am pretty sure this is headed in the right direction, though, because I think the subtotals are correct as far as exponential weighting. Now the filter context just needs to be adjusted to make it running/moving.
Friday, August 01, 2014 11:07 PM
• Try replacing the 5th measure EMA, with this:

=CALCULATE(
AVERAGEX(
DIVIDE(
SUMX(FILTER(Calendar, Calendar[TradingDays] = 1), [EMA Exp]))
),
DATESBETWEEN(
Calendar[FullDate],
LASTDATE(Calendar[FullDate])-24,
LASTDATE(Calendar[FullDate])
)
)

Not sure if this is 100% correct yet, but I think it is another step closer. There might be a performance concern on an enormous FactTable, but there may be opportunities to optimize some of this if the logic is right. I'm getting sub-second response times on slicers in the sample workbook.

Here is how this calculation compares to the CLOSE:

Let me know how you make out.

Monday, August 04, 2014 9:03 PM
• Hi Mike

Just saw it. It seems to be giving pretty good results (and also very fast).
I'll run a few more tests today/tomorrow comparing them with the original EMA formula and post the results with some charts.

Thanks a million!

Tuesday, August 05, 2014 12:29 AM
• Sorry for the delay. Great, a big step closer.

Here is a comparison with a simple moving average (SMA) and the real EMA 25:

As you can see, it seems to be reacting a bit faster than the correct EMA.

I'm not sure, but I think that the DATESBETWEEN part of the formula is also counting non-trading days, so if you enter -24 the real period is much shorter, and it reacts faster.

Also, I've tried to combine the last 3 calculated fields into one, with no luck. The purpose is to have different EMAs (different periods) without having to make a ton of calculated fields.

Would that be possible? That would also help with running double-smoothed EMAs, like an EMA 10 of an EMA 20.

Thank you again very much for your support.

Wednesday, August 06, 2014 4:53 AM
You are correct about how DATESBETWEEN counts non-trading days, which is why I mentioned that this was likely closer but not 100% correct.

After thinking about it, I don't even think it is needed, because the [EMA Exp] measure really takes care of the period.

I think this works better for EMA:

=CALCULATE(
AVERAGEX(
DIVIDE(
SUMX(FILTER(Calendar, Calendar[TradingDays] = 1), [EMA Exp]))
),
FILTER(
ALL(Calendar[FullDate]),
Calendar[FullDate] <= MAX(Calendar[FullDate])

)
)

This doesn't match up with yours exactly on the early days, but the gap shrinks as time goes on, and by 5/30/12 they are identical when rounded to 2 decimals. I'm not sure if this is because the Excel calculation has a defined starting point that is based on the previous SMA rather than the previous EMA.
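A shrinking gap is what the math predicts: a seeded recursive EMA and the normalized weighted-average form differ by a term that decays roughly like (1 - alpha)^t. A hypothetical Python illustration with made-up prices:

```python
# The gap between a seeded recursive EMA and the weighted-average
# form decays geometrically, so early days disagree while later
# days match. Synthetic data, purely for illustration.
alpha = 2 / 26
closes = [100 + ((i * 7) % 13) for i in range(250)]

seeded, prev = [], float(closes[0])        # recursive, seeded with day 0
for c in closes:
    prev = alpha * c + (1 - alpha) * prev
    seeded.append(prev)

weighted = []                              # normalized weighted average
for t in range(len(closes)):
    w = [alpha * (1 - alpha) ** k for k in range(t + 1)]
    num = sum(wi * c for wi, c in zip(w, reversed(closes[: t + 1])))
    weighted.append(num / sum(w))

gap_early = abs(seeded[5] - weighted[5])   # noticeable disagreement
gap_late = abs(seeded[200] - weighted[200])  # essentially identical
```

Whatever the starting point (first close or an SMA seed), the seed's influence shrinks by a factor of (1 - alpha) every day, which is why the two calculations converge after the start-up period.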

As far as consolidating measures and creating different periods, it is probably best to make sure the logic works for one first.  The combined measures will be much more complex due to filter context and therefore a lot tougher to troubleshoot.

One thing you could do now to test different period scenarios a little more easily is create a constant measure called something like [Factor]:=25 and then replace the "25"s in [EMA Exp] with [Factor]. Then all you would have to do is change the value of [Factor] in one place. If you wanted to get really fancy, you could make the value of [Factor] selectable in a slicer by using a disconnected table.

Wednesday, August 06, 2014 2:41 PM
That fixes the values. But it seems that performance diminishes drastically, as I think you're forcing it to go through all of the history. We'd probably still need to add a window to improve performance.

I'm trying with this:

=IF([Sum of CLOSE]=BLANK(),BLANK(),(CALCULATE(
AVERAGEX(
DIVIDE(
SUMX(FILTER(Calendar, Calendar[TradingDays] = 1), [EMA Exp]))
),
FILTER(
ALL(Calendar),
Calendar[FullDate]<=MAX(Calendar[FullDate])
)
)))

But the results are not as good as with your last formula. It's reacting a bit faster than the correct EMA.
I don't know if I'm missing something.

Wednesday, August 06, 2014 8:12 PM
• It's really up to you.  There is definitely a performance hit when iterating over larger data sets, but is the performance hit so dramatic that you are willing to sacrifice accuracy?

My last calculation looks like it is standing up very well against the manual EMA done in Excel once you get past those first 80-100 days. I'm not an expert on EMAs, but from what I've read, a 22-day EMA needs at least 100 days of history to become completely accurate, so that is also in line with my formula. After that 80+ day "start-up period," it is basically identical to the manual calculation for the next 2+ years of data in the workbook.

I see a slight performance hit on the workbook compared to the DATESBETWEEN version but it isn't anything that I would call unusable.  It takes 1-2 seconds for me to update compared to the other one which was almost instant.  I think 1-2 seconds is an acceptable response time especially considering the complexity of the calculation.

If it isn't acceptable for you, or you are using a much larger data set to test against, then maybe a window is needed. Just know that you are sacrificing accuracy to some degree. Your window should probably be as large as you can stand to make it, which is probably at least 3-4 times the size of [Factor].

I would probably go about it with something along these lines:

=CALCULATE(
AVERAGEX(
DIVIDE(
SUMX(FILTER(Calendar, Calendar[TradingDays] = 1), [EMA Exp]))
),
FILTER(
ALL(Calendar),
Calendar[FullDate]>=MAX(Calendar[FullDate]) - ([Factor] * 4)-1 &&
Calendar[FullDate]<=MAX(Calendar[FullDate])

)
)

Just know that this definitely isn't doing as well at matching up with the manual Excel calculation, and the performance gain is minor.

Really it is your call to make.

Whichever way you go, if you are going to calculate different EMA periods, I would definitely go the route of adding a disconnected table with your possible Factor/Period values. Something simple like this:

It can have whatever EMA periods you might want to look at.

Then, if you call this table Period, you can update [Factor] to this:

=MIN(Period[Period])

Now you should be able to use this as a slicer to automatically update [EMA] for a different period. Or you can use it in the Columns/Rows of your pivot to see all the EMA periods. Finally, you could create simple hard-coded period EMAs like this:

EMA25 = CALCULATE([EMA], Period[Period] = 25)

Or

EMA200 = CALCULATE([EMA], Period[Period] = 200)

All without directly changing or copying all the underlying measures.

I'd still be interested to see if anyone else out there has a completely different take on a solution to this problem.

• Marked as answer by Thursday, August 07, 2014 7:19 PM
Wednesday, August 06, 2014 10:49 PM
• Great post! So a factor of approximately 4x is needed if the EMA doesn't start from an SMA.

I was concerned about the performance because the database is huge having all history, so a limited window helps.

You pretty much solved it all elegantly. The slicer with a period is very cool, and now we can have very short and easy EMA formulas with fixed periods.

Thank you so much.

For the people reading this, I leave you a simple chart with 2 EMAs:

Cool, eh? Thanks, Mike.

Thursday, August 07, 2014 7:18 PM

Iterating over a large dataset should normally be avoided, but given the recursive nature of the normal EMA calculation, I didn't see a way around it for this one.

As far as the 4 multiplier on [Factor] for the time period window, that was somewhat arbitrary. You might play with that number, knowing that decreasing it gets further away from the more accurate EMA but speeds up performance, and increasing it does the exact opposite.

Looking at the graph of [EMA Exp], which really does most of the "magic" by controlling the weighting, you can see that even the weighting carries past the designated period but levels out significantly at that point.

Here is a graph of what the [EMA Exp] looks like by days back when [Factor] =25 days:

It is very steep before 25, then it starts to level out, approaching zero but never actually reaching it. Obviously, at some point the number gets so small that it might as well be zero. The period in the EMA is basically just designating when that "leveling out" occurs. My thinking was that if the EMA period is 25, then making the window approximately 100 (4 x 25) includes the majority of the days with significant weighting.
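That intuition can be checked numerically: with alpha = 2/(25+1), a window of 4 x [Factor] trading days captures almost all of the exponential weight, while a window equal to the period alone captures far less. A quick sketch (Python, assuming the thread's factor of 25):

```python
# How much of the total exponential weight falls inside a window of
# 4 * Factor trading days, for Factor = 25 (alpha = 2 / 26)?
factor = 25
alpha = 2 / (factor + 1)

# cumulative weight of the newest n days: 1 - (1 - alpha)**n
in_window = 1 - (1 - alpha) ** (4 * factor)   # nearly all of the weight
in_period = 1 - (1 - alpha) ** factor         # far less within 25 days
```

A 100-day window captures over 99.9% of the weight, while a 25-day window captures only around 86%, which is why the 4x multiplier matters for accuracy.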

Not perfect, but it sounds like we found a good balance!

Thursday, August 07, 2014 8:27 PM
• Hi,

Just wondering if you could share the final working version of the model? I am trying to replicate it but can't seem to bring it together.

Monday, September 01, 2014 4:49 AM
• This might be a stupid question (I'm relatively new to Power Pivot), but I was wondering if you might have an idea on the best direction.

I have used the examples above to help me build an EMA on a report I have for daily received volumes, but I was wondering if there is a way to do the calculation on a measure rather than a fact table. The reason is that I am looking at building a MACD indicator (a stock market tool) to identify trends. The first part of the equation is the MACD figure, which is the 12-period EMA minus the 26-period EMA, and the second part is the signal line, which is a 9-period EMA of the MACD.
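For what it's worth, the MACD construction described above can be sketched with ordinary recursive EMAs in Python (a hypothetical illustration, not the DAX measures from this thread):

```python
# MACD: 12-period EMA minus 26-period EMA of the closes; the signal
# line is a 9-period EMA of the MACD values themselves, i.e. an EMA
# computed over a derived series rather than over the fact table.
def ema(values, period):
    alpha = 2 / (period + 1)
    out, prev = [], float(values[0])
    for v in values:
        prev = alpha * v + (1 - alpha) * prev
        out.append(prev)
    return out

def macd(closes, fast=12, slow=26, signal=9):
    macd_line = [f - s for f, s in zip(ema(closes, fast), ema(closes, slow))]
    signal_line = ema(macd_line, signal)   # EMA of a measure, not raw data
    return macd_line, signal_line
```

The tricky part in DAX is exactly the last step: the signal line is an EMA over the output of another calculation rather than over a column in the fact table.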

I have been going round in circles trying to work out how it could be done.

Your thoughts, or anyone else's here, would be greatly appreciated.

Phil.

Wednesday, September 03, 2014 4:20 AM
• I'm looking for something similar myself with another indicator. Still gathering info for the moment.

You'd have better results if you start another thread, and don't forget to add a sample spreadsheet.
(If you need it to have an EMA, as mentioned, just link to this page as a reference.)

Wednesday, September 03, 2014 6:29 PM