Debunking the NTSA’s 2018 "Improving Retirement Savings for America's Public Educators" Study
It’s Been A Long Time Coming
In 2018, the National Tax-Deferred Savings Association conducted a survey and created a research paper. The paper “Improving Retirement Savings for America’s Public Educators” has been widely circulated to school districts, politicians, educators, and reporters. However, it’s unreliable, in my opinion. This is not a peer-reviewed research paper that was submitted to a journal. There is no detailed published methodology, and almost none of the underlying data is publicly available to scrutinize.
This research paper appears to be another in a long line of misinformation pieces designed to further the interests of insurance agents, brokers, and the companies they work for. I’ve drawn out every claim in the paper and analyzed it to see if the evidence supports it; even I was surprised at the outcome. Buckle up, this is a long one.
What follows is each claim, followed by my response.
Claim:
“Our research suggests that limiting choice of providers is directly correlated to decreased participation in the plan.”
Response:
It doesn’t say “our research concludes”; it says “suggests.” This has not stopped the NTSA from using this document as a tool to imply that the study is conclusive.
Claim:
“On average, account balances are 73% higher among plans with 15 or more providers compared to single provider arrangements.”
Response:
On the surface, this seems like irrefutable evidence that single-vendor plans are a terrible idea. But if you spend just a few minutes thinking about potential causes for this disparity, you’ll find this claim is deliberately misleading.
When a plan goes from multiple vendors to a single vendor, most existing 403(b) accounts do not move to the new provider. The number of accounts that move depends on who wins the bid, how much effort is put into helping the deselected vendor accounts move into the new plan, and how many existing accounts of the winning vendor are automatically converted into the new program. Every district has different results. For a district that has never gone through a consolidation before, there might be a lot of accounts that never make a move to the new provider. Accounts at the new provider will take many years to build up balances.
If the NTSA had provided access to methodology and the data, perhaps we could say with certainty where this dramatic increase in account balance comes from, but they didn’t. The NTSA wants you to believe that correlation is causation. They want you to think your account balances will be lower if you go to a single vendor.
In the above statistic, we have no idea whether the single vendor plans were always single vendor or only recently became single vendor. Time matters in this case.
Single vendor plans are not likely to include the balances of former employees, whereas multiple vendor plans will. When a plan goes single vendor, former employees are not generally the target for moving money into the new program. In fact, if these former employees are targeted at all, they will likely be targeted to roll over their money to an IRA instead of into the plan, as sometimes the new vendor’s reps make more money on rollovers than exchanges.
In addition, the NTSA is using “account balances” in this statistic, not participant balances. While subtle, this could account for the entire difference. Let’s say you have a plan with 15 vendors and 100 employees, 30 of whom are contributing, and the contributors have an average account balance of $10,000. This plan then goes to a single vendor. In the most extreme case, no one exchanges their money to the new provider; only new contributions make up the balances there. If we look only at the account balances of the single vendor after one year, there will be a dramatic difference, as the single vendor has only had 12 months to accumulate assets. However, the accounts that existed before the change still exist. The overall participant balance (add all the accounts of each participant together) will, on average, be higher than before.
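To make the distinction concrete, here is a minimal sketch using made-up numbers (mine, not the NTSA’s data) showing how a per-account comparison can make a brand-new single vendor look far worse even while each participant’s total balance grows:

```python
# Hypothetical illustration only -- these figures are mine, not the NTSA's data.
# A district moves from 15 vendors to 1. Legacy accounts stay with the old
# vendors; the new single vendor holds only one year of fresh contributions.

legacy_balance = 10_000         # average balance left behind at the old vendors
one_year_contributions = 3_864  # roughly one year of contributions at the new vendor

# "Account balance" view: judge the single-vendor plan only on the accounts
# it actually holds, which are one year old.
avg_account_balance_at_new_vendor = one_year_contributions

# "Participant balance" view: add each person's old and new accounts together.
avg_participant_balance = legacy_balance + one_year_contributions

print(f"Average account balance at the new vendor: ${avg_account_balance_at_new_vendor:,}")
print(f"Average total balance per participant:     ${avg_participant_balance:,}")
# The participant is better off than before the change, yet comparing
# per-account averages makes the single-vendor plan look dramatically smaller.
```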
The NTSA does not compare apples to apples. At best, they do not understand the most basic principle in statistics (correlation is not causation), and at worst, they are deliberately misleading.
Some states require school districts to offer multiple vendors, and these districts have been doing so for decades; thus, their account balances will be very high, on average, compared to a district that transitions to a single vendor. This stat shouldn’t be used as a point of comparison as it’s so biased.
Age could account for the difference or some of the difference. Is it possible that single-vendor plans tend to have younger employees than multiple-vendor plans? Yes, it’s possible. Of course, we can’t know because the NTSA will not produce the data.
Could retirement incentives influence the difference in account balances? Employers that offer retirement incentives generally deposit the money into a 403(b), which creates higher account balances. Does this account for the difference? It’s highly doubtful, but I’m at least posing plausible and testable reasons for the differences. The NTSA simply says the difference must be due to a single vendor while providing no evidence for the claim.
Claim:
“Based on our research, monthly contributions into 403(b) plans average $322 monthly, or $3,864 per year (assuming contributions continue throughout the summer months). Using the average teacher salary of $55,919, the contributions to 403(b) plans are roughly 6.9% of salary on average.”
Response:
This document states on page 3 that the average participation rate is 27.13% and then produces the above claim. I have some issues with how the data is presented.
1. I appreciate that they consider the summer months, but only some educators receive summer paychecks, and not all of those who do contribute to a 403(b) from them. Thus, the average monthly contribution is likely inflated. I’m unclear why they wouldn’t have taken an average of the prior twelve months of contributions and divided it by the number of people contributing in each of those months.
2. Why are we using the average teacher salary when we don’t know what percentage of participants are teachers? School districts often have non-teacher staff that can exceed 45% of total employment. The average teacher’s salary does not give us a proper contribution percentage.
3. The 6.9% salary number is not the average contribution percentage of the school district; it’s the average contribution percentage of those who are contributing. 73% of employees are not contributing to the plan. If 27% contribute $3,864 annually and 73% contribute 0%, the weighted average contribution per employee is $1,043 annually. Using the average teacher salary that the study references (again, we should not do this), we would find that the contribution rate is 1.865%, or less than 2%. This number is also wrong; we need much better data to determine the contribution percentage. I’m just using the methodology presented by the NTSA; the arithmetic is sketched below.
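Here is a quick sanity check of that arithmetic, using the figures the NTSA paper itself reports (a 27% participation rate, $3,864 in annual contributions per contributor, and a $55,919 average teacher salary):

```python
# Back-of-the-envelope check using the figures reported in the NTSA paper.
participation_rate = 0.27          # share of employees actually contributing
annual_contribution = 3_864        # average annual contribution of a contributor
average_teacher_salary = 55_919    # average teacher salary cited by the paper

# Contribution rate among contributors only (this is the paper's ~6.9% figure).
rate_contributors_only = annual_contribution / average_teacher_salary

# Weighted average across ALL employees, counting non-contributors at $0.
avg_dollars_per_employee = participation_rate * annual_contribution
rate_all_employees = avg_dollars_per_employee / average_teacher_salary

print(f"Contributors only:      {rate_contributors_only:.1%}")      # ~6.9%
print(f"Average $ per employee: ${avg_dollars_per_employee:,.0f}")  # ~$1,043
print(f"All employees combined: {rate_all_employees:.2%}")          # ~1.9%, i.e. less than 2%
```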
Claim:
“As with corporate programs, the research did reveal that participation and contribution rates are positively correlated with salaries. States that have the highest average teacher salaries, such as New York, New Jersey and Connecticut, also had the highest participation and contribution rates.”
Response:
This is the most accurate statement in this document. It’s highly likely that the main factor driving participation is the average salary. This point cannot be overstated. The states with the higher participation and contribution rates are high-paying states; in other words, there appears to be good data to suggest that participation rates might be driven more by income level than by the number of vendors. Pay educators more, and participation rates will go up. I’m sure the NTSA can agree with me on this.
Claim:
“The range of participation rates in America’s public school districts is dramatic, suggesting that the choices that each school district makes available to employees and the resources that they provide to help employees understand the benefits of participation are key differences in driving participation rates. As the chart below indicates, the participation rates for the 4,473 school districts in the survey ranges from less than 7.53% to more than 99.14%.”
Response:
A paragraph earlier, a correlation between income and participation rates is noted, but there is no further commentary. Then in this paragraph, the author simply cites a range of data and draws a conclusion that isn’t there: that “choices that each school district makes available to employees and the resources that they provide to help employees understand the benefits of participation are key differences in driving participation rates.” Perhaps they will provide data in the coming paragraphs. Still, the data cited don’t support the conclusion; it’s pure conjecture. The author could “suggest” anything to explain the wide variation.
I’m going to “suggest” some reasons why there could be a wide disparity in participation rates. None of these suggestions are the reason; they are just plausible.
1. Income
2. Presence of a 457(b)
3. Economic conditions differ between employers
4. Political conditions differ between states
5. Average age of the employee population
6. Presence of a match
7. Ease of enrollment
8. Transparency of expenses
9. Easily available information on the plan
10. Lack of availability of information on the plan
11. Objective education
12. Biased education
13. Normal age of retirement in the state Defined Benefit plan
14. Availability of Social Security
15. Whether health insurance is a paid benefit
I could go on and on here. My point is not that any of the above items are responsible for the range of participation rates that the research found but that the conclusion that the range is somehow the result of “choices that each school district makes available to employees” is a claim not supported by data. Many factors could drive participation, but the number of vendors offered is unlikely to be the primary reason.
Claim:
“The research revealed that the number one factor driving participation and savings rates in school districts is participant choice.”
Response:
No, it didn’t.
Whatever in this document can be considered research did not demonstrate, by any standard of evidence, “that the number one factor driving participation and savings rates in school districts is participant choice.” This claim is simply asserted, not supported.
In my opinion, the outcome was determined before the study took place. No hypothesis was tested here.
Claim:
“An essential element in the broader array of provider choice is the breadth of education in the form of affiliated advisors who educate, encourage, and advise these workers onsite.”
Response:
This is another statement made without proper evidence. The claim is even more insidious because it pushes the false narrative that professional salespeople who work for or are affiliated with 403(b) vendors are actually “advisors.” No evidence is provided that those who enroll school employees are “advisors” in the legal sense of the term. The overwhelming majority of those who enroll public school employees do not always work in an Investment Advisor Representative capacity while providing sales and service. The term “advisor” is abused throughout the report as a catchall for anyone providing sales or service support.
Most people working in the government 403(b) space are not full-time Investment Advisor Representatives (IARs). Instead, they generally act as insurance agents and Registered Representatives (brokers). This means they are held to different duties than someone who is an Investment Advisor Representative. These salespeople might sometimes offer a service as an IAR, but they are not obligated to act in that capacity at all times. Employees should not assume that the representatives of vendors roaming their campuses are acting in a fiduciary capacity.
If the claim had been that participation rates are likely to be driven higher by the presence of objective plan education, I would be able to get behind it. The claim, however, is that more vendors equals more presence of salespeople, and the greater presence of salespeople creates higher participation (I realize the claim is “advisor” and not “salesperson”). However, this claim has not been tested in any real way. It’s entirely possible that the presence of individuals who do not have the participant’s best interest in mind and are simply there to sell a product could be what is keeping participation low. That said, intuitively, the more people selling something, the more likely that product will be sold. Another flaw in this research is that the quality of the products sold is not being evaluated.
Claim:
“There is 25% greater participation in plans with 15 or more investment providers compared to plans with only one provider.”
Response:
What plans were included? We don’t know as the NTSA will not release the data. None of the data has been released. We should not accept research that cannot be tested.
Were all possible single vendor plans included in this “research”? We don’t know because the NTSA will not release the data.
It’s unlikely that all single-vendor plans are included in the data because the entity putting the data together is often cut off from receiving data from single-vendor plans. When a 403(b) plan reduces vendors to just one, the district often changes the compliance administrator. A compliance administrator is more likely to be used in multiple-vendor plans because they are more challenging to manage. The fact that the only data being used in this research is from entities that serve almost exclusively multiple-vendor plans is a significant bias in the data.
New York City is probably the largest single vendor 403(b) in the nation; its data is not included in these numbers. We know the New York City data is not included because its 403(b) plan is not serviced by any compliance administrators that provided data.
Four compliance administrators provided data, but those four were not disclosed. My guess is that it was The OMNI Group, TSA Consulting Group (Omni and TSACG are now one company), Carruth Compliance Consulting, and NBS (though it’s plausible one of them is Equitable’s PlanConnect). This leaves out TCG and the vendor-based compliance entities from Voya (Plan with Ease), Equitable (PlanConnect), and Corebridge (Retirement Manager), as well as several regional compliance administrators and employers that do their own compliance. There is a high likelihood that the data is skewed based on the type of employers who would need to hire a compliance administrator.
Charter schools, which are more likely to be single-vendor in my experience (I do not have data to back this claim), were excluded from the dataset.
What about the 457(b)? Many school districts have added 457(b) plans over the years. In many cases, these plans are single vendor and offer a better option to employees. The study referenced by the NTSA does not consider the possibility that 403(b) participation may drop while overall participation across employer plans (both 403(b) and 457(b)) experiences no drop at all. This is another plausible but unexplored reason that 403(b) participation could drop when the number of vendors is reduced.
Claim:
“On average, account balances are 73% higher among plans with 15 or more providers compared to single provider arrangements.”
Response:
On the surface, this seems like irrefutable evidence that single-vendor plans are a terrible idea. But if you spend just a few minutes thinking about potential causes for this disparity, it becomes clear that the NTSA is deliberately misleading.
When a plan goes from multiple vendors to a single vendor, most existing 403(b) accounts do not move to the new single provider. The number of accounts that move depends on who wins the bid and how much effort is put into helping the deselected vendor accounts move to the new vendor. Another factor is how many existing accounts of the winning vendor are automatically converted into the new program. In addition, surrender charges from existing low-quality vendors can prevent accounts at deselected vendors from moving to the new vendor. Every district has different results. For a district that has never gone through a consolidation before, there might be a lot of accounts that never make the move to the new provider. It will take the new provider many years to build high account balances.
If the NTSA had provided access to methodology and the data, perhaps we could say with certainty where this dramatic increase in account balance comes from, but they didn’t. The NTSA wants you to believe that correlation is causation. They want you to think that if you move to a single-vendor, your participant account balances will be lower. They’ve presented no evidence to support this claim.
In the above statistic, we have no idea whether the single vendor plans were always single vendor or only recently became single vendor; time matters.
Single vendor plans are not likely to include the balances of former employees, whereas multiple vendor plans will. When a plan goes to a single vendor, former employees are not generally the target for moving money into the new program. In fact, if these former employees are targeted at all, they will likely be targeted to roll over their money to an IRA instead of into the plan, as sometimes the new vendor’s reps make more money on rollovers than exchanges.
In addition, the NTSA is using “account balances” in this statistic, not participant balances. While subtle, this could account for the entire difference. Let’s say you have a plan with 15 vendors and 100 employees, 30 of whom are contributing, and the contributors have an average account balance of $10,000. This plan then goes to a single vendor. In the most extreme case, no one exchanges their money to the new provider; only new contributions plus growth make up the balances. In this case, if we look only at the account balances of the single vendor after one year, there will be a dramatic difference, as the single vendor has only had 12 months to accumulate assets. The single vendor plan will appear to have fewer assets. Perhaps the NTSA accounted for this? We can’t know, as they did not release the data or methodology.
Some states require their school districts to offer multiple vendors and have for decades; thus, their account balances will be very high on average compared to a district that goes single vendor. This stat shouldn’t be used as a point of comparison as it’s so biased.
Age could account for the difference or some of the difference. Could single-vendor plans tend to have younger employees than multiple-vendor plans? It’s plausible. Of course, we can’t know because the NTSA will not produce the data.
Could retirement incentives influence the difference in account balances? Employers that offer retirement incentives generally deposit the money into a 403(b), which creates higher account balances. Does this account for the difference? I doubt it, but I’m at least posing plausible and testable reasons for the differences. The NTSA simply says the difference must be due to a single vendor while providing no evidence for the claim.
Claim:
“There is a 203% increase in average contribution rates among plans providing access to 15 or more providers compared to plans with only one provider.”
Response:
This is a tortured claim. We cannot analyze it by going to the data; the authors do not make the data available. The NTSA wants you to believe that contribution rates will dramatically increase simply by expanding an employer plan to 15 or more vendors. The implication is that other interventions won’t produce a similar increase. I call bullshit.
First, I don’t trust the data. Why should I? It’s quite literally not provided. What methodology was used? The claim is such an outlier you would think the researchers would go back and try to figure out why they are getting such results.
Let’s assume the data is accurate. What factors could affect the results?
1. The region the single vendor districts reside in. Are single vendor districts overrepresented in low-salary areas while multiple vendor districts reside in higher-income areas? We don’t know; they don’t tell us.
2. Is a 457(b) available in these districts (they often are with single-vendor districts)? Perhaps participation has shifted to that program.
3. Do the lower contribution rate districts belong to systems that contribute to Social Security? If an employee contributes to a pension and Social Security, is it plausible to assume it could affect average contribution rates?
4. Are the districts’ demographics skewed in one direction or the other? A younger district will have lower contribution rates. Is a single vendor correlated with younger demographics? We don’t know.
5. Perhaps the single vendor plans they are including did not have a good transition when they went single vendor. If the employer did not automatically move salary reduction agreements to the new recordkeeper and had to get new signatures, participation and contribution rates would be reduced. It’s vital to have a well-designed transition plan.
6. It could have been sabotage. The industry desperately seeks to stop any reduction in vendors. You think I’m joking, but when I was working on a bill in California many years ago, an industry flier circulated comparing going single vendor to the Holocaust. I kid you not. Even worse, the bill’s sponsor was Jewish. I’ve met many friendly people who are part of the 403(b) industry, and I would not lump them into that category, but the industry does weird things when cornered.
In 2020, the insurance company Horace Mann was fined for sabotaging the state of Delaware’s single-vendor transition. A representative encouraged their customers to stop their contributions to the new, low-cost state 403(b) and instead contribute to an IRA (usually a Roth) invested in a much higher-cost variable annuity. You can read about it here. Yes, industry sabotage is real and may account for lower contribution rates if the sample size of plans is not large enough.
I’ve identified six factors that could affect contribution rates, none of which involve the number of vendors. I’m willing to bet that if I spent more time, I could have come up with six more. The NTSA is claiming something for which it doesn’t have evidence. They don’t provide the data for others to verify and test or propose alternative hypotheses for why the data may show what it shows. One reason is that they want you to trust that their narrative is correct.
Even if the above claim were valid, it says nothing about the quality of the products.
It’s interesting how the NTSA doesn’t say a word about the quality of the product or the fiduciary nature of advisors.
Claim:
“Single provider arrangements have the lowest participation rate; 8% below the national average.”
Response:
No, they don’t.
It’s been said that what can be claimed without evidence can be dismissed without evidence. Nothing presented provides conclusive evidence to support the claim that single-vendor programs have lower participation rates.
Most state and local government plans, which are very similar to public school employee plans, are single-vendor plans. They generally have higher participation and contribution rates. Single-vendor alone does not result in lower participation rates, just like multiple vendors don’t lead to higher contribution rates. There are many factors involved.
This is not a study. No data is provided. There is no peer review. It has not been submitted to a journal for publication. This is propaganda. It’s a sustained campaign to mislead to keep payroll slots open for the vendors that fund the NTSA and to allow the member compliance entities to keep earning an income from those vendors for providing the compliance. Follow the money.
Claim:
“Simply stated, the data reveals a positive and significant correlation between the number of choices/advisors and participation. While there is obviously a point of diminishing returns, in general, as access to these resources increase, participation and savings increase too.”
Response:
If accurate, provide the data and the statistics to back it.
Why is the NTSA lumping “choices” and “advisors” together? The data didn’t test for the number of advisors; it tested for the number of vendors. In addition, the term advisor has a different meaning to the NTSA than it does to the public. When the NTSA incorrectly uses the term “advisor,” they refer to ANYONE who helps educators with their 403(b). This is a profoundly misguided use of the term. An advisor is someone who is acting in a fiduciary capacity.
The overwhelming majority of people working to sell 403(b) products only sometimes work in a fiduciary capacity when selling those products and thus cannot be referred to as advisors. Those selling 403(b) products generally fall into two categories: insurance agents and brokers (Registered Representatives), neither of which has a full-time fiduciary duty. These representatives might tell you they must act in your best interest, but the standard they are referring to either doesn’t apply to non-ERISA 403(b) sales or is not the same as someone who is acting as a fiduciary at all times. These people might also be registered as investment advisors but do not always have to act in that advisory role.
Someone who is a Registered Representative (a broker) and an Investment Advisor Representative would be referred to as Dual Registered. Historically, there have been issues with this way of registering, as Nicole Boyson describes in her 2019 paper titled “The Worst of Both Worlds? Dual Registered Investment Advisors,” linked here.
If the NTSA wants to argue that education helps with enrollment, we’re happy to agree with them. However, we also believe that education needs to be objective and that the investment options available must be high-quality.
Survey Data
Many of the NTSA’s conclusions are based on a survey. However, what was distributed was a questionnaire. In research, a questionnaire and a survey are two different things; a questionnaire is just one possible data-collection instrument within a survey. The NTSA wants everyone to believe their document is an objective research paper, but it is not. Using the term survey rather than questionnaire seems to be a deliberate attempt to mislead; if you’d like to learn more about the difference, you can read about it here. The following is what we know about the questionnaire:
1. Only distributed to schools that used the four compliance administrators; it did not go out to all schools.
2. Ten questions were designed to be answered in less than 5 minutes.
3. Participation was voluntary and confidential.
4. Participants could decline to answer specific questions.
5. The questions were not made available to the public.
6. No data concerning demographics or geography was released.
7. No methodology was released concerning how the data was turned into statistics.
The narrative this paper pushes, that multi-vendor 403(b) plans are always better, rests on flimsy, untestable data. As with many of the vendors that sponsor the NTSA, there needs to be more transparency.
Because the data is so questionable, none of the conclusions can be taken seriously.
Claim:
“Additionally, recent research by AXA Equitable Life regarding provider choice, shows where districts offer a choice of providers, participants are significantly more satisfied and confident in their 403(b) plans than participants in single provider districts.”
Response:
The footnote to this claim is “Multiple vs. Single Provider Topline Report,” Zeldis Research Associates, 25 September 2018. As far as we can tell, this research does not exist, or at least is not available to the public in any meaningful way. I can find only two citations of this research on all of Google; both are from the NTSA-distributed document citing this research.
This claim must not be taken seriously, given the lack of transparency surrounding the data. No peer-reviewed paper could get away with making claims not backed up by transparent research.
Claim:
“Three in five participants with choice of provider are more satisfied with the performance of their 403(b) account. Where provider choice is offered, participants are more confident in the quality of their investment choices (61% vs. 43%) and in their ability to meet their retirement goals (66% vs. 55%).”
Response:
There is no evidence to support these spurious claims. As mentioned in the previous claim, this “data” comes from an AXA/Equitable research piece that is not publicly available or peer-reviewed.
These claims are spread in actual testimony before legislatures. This is concerning.
Claim:
“While further research would assist in validating the results, the data reveals a correlation between limiting advisors and investment providers and a decrease in the level of participation by employees in 403(b) plans.”
Response:
Correlation is not causation. Correlation is not causation. Correlation is not causation. We can’t even be sure the data is reliable, yet the NTSA wants you to believe that a correlation exists and is, in fact, causation. At least they say that further research is needed to validate the results, indicating that even the NTSA doesn’t believe the research has been validated. Do you see the sleight of hand?
Step 1: Design something that resembles academic research.
Step 2: Write up the “research” into a paper that makes factual claims based on that research.
Step 3: Pretend that this paper is reliable.
Step 4: Spread the paper to school districts, administrators, unions, and politicians.
The “research” is not reliable. The NTSA did not prove its claim that reducing vendors reduces participation and that the reduction is primarily related to losing access to a sales agent. You would never know it by reading this paper.
Claim:
“This does not appear to be caused by alternate factors. Thus, the data suggests that fewer employees participate in 403(b) plans when the number of available investment providers and access to trusted advisor resources is limited.”
Response:
I’ve listed several alternate factors that could explain reductions in participation when vendors are reduced. The NTSA produced no research to support this claim; the only other factor considered was income, which was spoken of only once.
“Trusted advisor resources” is meant to equate all insurance agents and registered representatives with actual advisors who have a fiduciary duty. Some fiduciaries work in the 403(b) space but rarely work for vendors. The ones that work for vendors generally do not always act in a fiduciary capacity. The continued conflating of the term “advisor” with salespeople is purposeful.
Claim:
“Further, the research shows that a disruption of the advisor-client relationship can have a drastic effect on participation rates. When public educators no longer have access to the option in which they chose to save, as well as the professional assistance of a trusted advisor, they stop saving in the plan.”
Response:
Here is another example of the NTSA conflating “advisors” with salespeople and drawing conclusions from unproven evidence.
The NTSA fails to point out that even when a school district becomes single-vendor or reduces vendors, the person servicing the educators’ existing accounts stays. Those accounts still exist. If the representative on those accounts disappears, the issue is with that representative, not the vendor reduction.
As mentioned previously, the NTSA hasn’t presented any evidence to demonstrate that there are actual “Advisors” working with most participants. An advisor is someone who is committed to working as a fiduciary at all times; this is exceedingly rare in the 403(b) space. Look at the most prominent vendors in this space; very few of their representatives always work as Investment Advisor Representatives.
If the NTSA is implying that losing access to a salesperson leads to participants no longer wanting to save toward retirement, it has yet to provide any evidence.
The case in Delaware is evidence that the salespeople the NTSA is referring to are actually the ones who advise their clients to stop saving in a 403(b) when those plans become a single vendor. This represents yet another alternative factor the NTSA failed to investigate. The representative worked for Horace Mann, and the spokesperson for Horace Mann said in a PlanSponsor article, “The Investor Protection Unit of the Delaware Department of Justice undertook an investigation related to Delaware’s 2016 403(b) transition to a sole provider. In that transition, other providers, including Horace Mann, were no longer able to sell 403(b)s in the public schools in the state. One former Horace Mann representative responded to the change by soliciting existing Horace Mann 403(b) clients to open IRA accounts if they wanted to continue saving for retirement with Horace Mann.”
Whether this transition to a single vendor affected participation rates is unknown, but if it did, the irony is that at least one of the factors was the “professional assistance of a trusted advisor.” This salesperson was not a fiduciary advisor at all times.
The NTSA isn’t interested in any outcome that doesn’t produce or endorse a multiple-vendor system.
Claim(s):
Example 1
“A Pennsylvania school district experienced a 40% drop in participation after reducing investment provider options.”
Response:
This is anecdotal evidence.
No data is provided to back up this claim.
What school district?
No case study is presented to determine whether there was a 40% decrease in participation and whether such a decrease could be attributed to reducing vendors rather than other factors.
What year did this happen?
How was the transition managed?
The NTSA is making a claim without providing evidence to support it.
Example 2
“A Maryland school district went from 10 investment providers to one, and saw the number of active participants drop from 1,000 to 775. In 2016, the district increased to four investment providers and has since seen an increase in participation.”
Response:
This is anecdotal evidence.
No data is provided to back up this claim.
What school district?
No case study is presented to determine whether there was a drop in participation and whether such a decrease could be attributed to reduced vendors rather than other factors.
What year did this happen?
How was the transition managed?
The NTSA is making a claim without providing evidence to support it.
Example 3
“A Florida school district went from 12 to 5 investment providers and lost more than 1,000 active contributors, all of whom had not resumed participation since the change at the time of the survey.”
Response:
This is anecdotal evidence.
No data is provided to back up this claim.
What school district?
No case study is presented to determine if there was a decrease of 1,000 active contributors and if such a decrease could be attributed to reducing vendors and not other factors. What if this is a very large school district (as tends to be the case in Florida), and the transition could have been managed better? What if there was also an incentive to get educators to retire that year?
What year did this happen?
How was the transition managed?
The NTSA is making a claim without providing evidence to support it.
Example 4
“262 school districts in Michigan went from 16 to 5 investment providers in 2009. Participation went from 23,000 to fewer than 17,000 active contributors. Today, after adding back several investment providers, they are finally back to 23,000 participants.”
Response:
This is anecdotal evidence.
No data is provided to back up this claim.
Which school districts?
No case study is presented to determine if there was a decrease in participation and if such a decrease could be attributed to reducing vendors and not other factors. Can you think of anything else going on in Michigan, the country, and the world in 2009 that could have led to a decrease in participation? I’m sure the NTSA has heard of the Global Financial Crisis; this couldn’t have been a contributing factor, though. Yes, that was sarcasm.
What year did this happen?
How was the transition managed?
The NTSA is making a claim without providing evidence to support it.
Example 5
“In 2009, Iowa transitioned from a competitive, open 403(b) marketplace model to a narrow number of five options, only to see participation rates in the program plummet dramatically – with some counties showing enrollment reductions of up to 50%. The number of investment choices was recently expanded to 30 approved companies with the hope that the workers will once again save for retirement in the 403(b) plan.”
Response:
This is anecdotal evidence.
No data is provided to back up this claim.
What school district?
No case study is presented to determine whether enrollments decreased by 50% and whether such a decrease could be attributed to reducing vendors rather than other factors.
It’s not challenging to develop alternative hypotheses about what could have led to a decrease, assuming such a decrease happened. The most significant financial crisis since the Great Depression occurred during this period. You cannot expect a rational person to believe the Global Financial Crisis did not have at least something to do with a drop in enrollment.
It’s unclear if participation dropped by 50% or if the rate of people enrolling dropped by 50%. If the latter, a drop in new enrollments is hardly surprising. It was 2009, and educators were being laid off left and right and seeing their income cut. Educator spouses were losing their jobs. It seems to me that enrolling in a 403(b) might not have been their top priority.
As the saying popularized by Mark Twain goes, there are “lies, damned lies, and statistics.”
How was the transition managed?
The NTSA is making a claim without providing evidence to support it.
Claim:
“School district employees need to be given adequate information to make informed decisions about their 403(b) plans. This information should include the services associated with the various offerings so participants can choose the option that best suits their needs. For example, some participants want a very low-cost option with limited personal service, while others want to work closely with an advisor on a one-on-one basis and are willing to pay for that level of service. In general, a key factor to increasing participation rates is to offer employees more choice, not less.”
Response:
This seems more like an argument for objective education paired with a low-cost, single-vendor 403(b) plan that offers the option of working with a fiduciary advisor.
The NTSA asserts, “In general, a key factor to increasing participation rates is to offer employees more choice, not less.” They can assert this all they want, but it’s not been proven. As has been said for hundreds of years, “If you repeat a lie long enough, it becomes the truth.” This paper aims to use scant evidence to support a preconceived notion that benefits the people putting this paper out to the public and then to repeat this preconceived notion over and over until it becomes dogma.
The NTSA is making a claim without providing evidence to support it.
Claim:
“Our research showed that there is a 5% increase in participation where employees have access to retirement education often provided by advisors at work.”
Response:
We agree that objective, non-sales-based education is likely to increase participation.
Claim:
“Furthermore, 7 out of 10 workers who have zero retirement savings do not have access to any retirement education from their employers. Employers that do not provide access to retirement education have employees who report higher levels of anxiety and fear about retirement.”
Response:
Great, let’s provide objective, unbiased retirement education to our nation’s educators! I cannot find the study mentioned in the footnote for this claim. The claim, as presented, says nothing about single versus multiple vendors. It does appear to support objective education and the idea that such education might be a factor in participation rates.
Claim:
“Numerous studies have indicated that employees want help with their retirement plans.”
Response:
We have no doubt employees desire more education and help with their retirement plans, but no peer-reviewed studies have been cited for this claim. When participants are confronted with the long-term costs of such “help,” would they have a different view? We can’t know; that wasn’t studied.
Claims:
“Using an advisor has measurable financial benefits to the 403(b) participant. Recent research conducted by AXA Equitable Life regarding the value of an advisor in K-12 participant preparedness and 403(b) account performance showed that:
• Participants who use an advisor have nearly double the median account balance than those who do not.
• Participants who work with an advisor have 15% greater diversification of assets.
• More than half of participants in the AXA Equitable Life study attribute an early start in saving to the influence of a trusted advisor.
• Participants who use an advisor contribute 49% more annually on average.
• Participants who work with an advisor are significantly more likely to have increased their contributions 24% more often since opening their account.
• Participants report higher satisfaction (72% vs. 54%) with their 403(b) plan overall and higher confidence (64% vs. 56%) in meeting their retirement goals when working with an advisor than those who do not.”
Response:
There is no citation for this research. If it’s connected to the earlier-mentioned research, then we can only repeat what we said before: the footnote to that claim is “Multiple vs. Single Provider Topline Report,” Zeldis Research Associates, 25 September 2018. As far as we can tell, this research does not exist, or at least is not available to the public in any meaningful way. I can find only two citations of this research on all of Google; both are from the NTSA-distributed document citing this research.
In 2016, I contacted Equitable (then AXA) to get details about the study. I was told the following in a voicemail from one of the authors on March 2, 2016:
“Hi Scott this is John Cline at AXA. We've never intended to release the full study because it's an internal study or publish methodology other than what we've already provided. So we're not making it more public that has been already the study. The largest part of the study was basically for internal use and we only publish the highlights in the white paper. So that's the situation I just wanted to let you know since you had written about it to me last night. Thanks very much.”
Mr. Cline was very polite and then forwarded me a presentation made at a SPARK conference on November 15th. I’m hosting the presentation pdf here, which does not support the above data.
There is another AXA (Equitable) survey that was released titled “The Value of the Advisor: The Impact of Advisors on Financial Outcomes Among K-12 Educators,” but any version of this survey during this time period has been scrubbed from the internet and Equitable’s website (there is a 2022 version here). You can find it referenced in a few articles, including here, but the link within the article takes you to a blank page. There is a summary of the survey, but there is no data to back it up; you can find that summary here. Research that is peer-reviewed and high quality doesn’t just disappear. I challenge the NTSA to provide ALL the data behind their claims and make it publicly available.
We’ll start with the fact that the term “Advisor” is once again used to encompass any support regardless of fiduciary duty. This in itself is highly misleading and should, by itself, call the research into question. The difference between this survey and the NTSA study is that AXA is honest about the term not referring to an actual investment advisor. On slide 3 of the SPARK presentation, they disclose the following:
“For purposes of this discussion, “advisor” is used as a general term to describe insurance/annuity and investment sales and advisory professionals who may hold varied licensing as insurance agents, registered representatives of broker-dealers, and investment advisory representatives (IAR) of registered investment advisors, respectively. “Advisor” in this context is not intended to necessarily refer to IAR-offered financial advisory/planning services.”
In Equitable’s summary of their “The Value of the Advisor” survey, they disclose:
“The use of the term of “financial advisor” or “advisor” for purposes of the survey questions and responses by both the consumers and the financial advisors queried does not necessarily imply that the individual is a registered investment advisor (RIA). The use of these two terms is meant in a general sense of the word or phrase to describe working with an investment advisor, a licensed insurance agent or other financial professionals who may sell annuity products.”
In terms of claims, the first one is:
“Participants who use an advisor have nearly double the median account balance than those who do not.”
Response:
This appears to be untrue. Since the NTSA has not provided a usable citation, we can only assume that the research AXA provided directly to 403bwise is the research the NTSA is referencing. The NTSA could quickly clear this up by releasing the data. They’ve had six years.
The actual research, which was just an online questionnaire and, as far as we can tell, was neither peer-reviewed nor well designed, says something entirely different from what the NTSA claims. The median account balance for “use an advisor” is $23,500. The median account balance for the “advisor option, but don’t use” group is $17,600. For those who are math-challenged, that is not nearly double; it’s about 34% higher. Where is the NTSA getting their data?
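The arithmetic is easy to check. A quick sketch, using the medians as reported above from the SPARK presentation:

```python
# Checking the "nearly double" claim against the medians reported in the
# SPARK presentation (as cited above).
median_with_advisor = 23_500      # "use an advisor"
median_without_advisor = 17_600   # "advisor option, but don't use"

ratio = median_with_advisor / median_without_advisor
print(f"{ratio:.2f}x, i.e. about {ratio - 1:.0%} higher -- not double (which would be ~100%)")
# -> 1.34x, i.e. about 34% higher
```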
The second claim is:
“Participants who work with an advisor have 15% greater diversification of assets.”
Response:
There is no available data in the study to support this claim. I’ll point you to page 12 of the SPARK presentation. Not only does the evidence not exist within this body of work, but it’s also unclear whether the way the question was asked could produce a statistically meaningful answer. It also says nothing about how different plan structures might influence greater diversification or what the phrase “greater diversification” even means. The term “advisor” is a catch-all for anyone who provides help and is entirely defined by the person answering the survey.
The third claim is:
“More than half of participants in the AXA Equitable Life study attribute an early start in saving to the influence of a trusted advisor.”
Response:
Remember that the term “trusted advisor” is not used in the survey. The term “advisor” refers to anyone who might contact a participant regardless of fiduciary responsibility. The conclusion shouldn’t be that a “trusted advisor” helps get educators started earlier; it should be that providing education and help with enrollment gets educators started earlier. This should come as no surprise. They didn’t study whether objective education or auto-enrollment can do a better job at much lower costs. They never propose alternatives that could produce similar or better results at lower costs.
Further, this claim is misleading. The data said 34% “started to contribute much earlier” and 33% “started to contribute somewhat earlier.” What does “much earlier” mean? What does “somewhat earlier” mean? Whatever the respondent believes it to mean. The questions were asked in a manner that elicited a desired response. The survey data is not interesting because it’s not disinterested; AXA (Equitable) wanted specific outcomes and designed the survey to achieve them, in my opinion. Even then, the data isn’t very robust and certainly isn’t reliable.
You can find the data on page 11 of the SPARK presentation.
The fourth claim is:
“Participants who use an advisor contribute 49% more annually on average.”
Response:
Once again, no data is available to back this claim. A survey was designed to elicit a desired outcome and promote a narrative that helps preserve the status quo.
The fifth claim is:
“Participants who work with an advisor are significantly more likely to have increased their contributions 24% more often since opening their account.”
Response:
No data has been provided to support this claim.
The sixth claim is:
“Participants report higher satisfaction (72% vs. 54%) with their 403(b) plan overall and higher confidence (64% vs. 56%) in meeting their retirement goals when working with an advisor than those who do not.”
Response:
One curious aspect of this survey is which companies show up for participants classified as not working with an advisor. If we use the SPARK conference presentation and look at pages 36 and 37, we find that those who said they weren’t working with an advisor had accounts with firms that would meet the survey’s definition of an advisor. Only one vendor on the list doesn’t offer an advisor as defined by the survey: Fidelity. What is strange is that Fidelity ended up in all three classifications.
There are three classifications, as follows:
Use an Advisor
Advisor Option But Don’t Use
No Advisor Option
The question was, “Now looking at your statement online or a recent paper statement, which company provides the 403(b) that you are currently contributing to?” Fidelity was the most common answer in all three classifications. Traditionally, at least in K-12 public 403(b) plans, Fidelity is not an option you use with an advisor. Of course, AXA has expanded the definition of advisor for this survey to include just about anyone who works for or with a vendor.
The study lists all the vendors but doesn’t determine how many participants contribute to each. There seems to have been massive oversampling, and they did not get a representative group of educators. The survey doesn’t seem to distinguish between public and private K-12.
Within this context, the above claim seems to provide evidence that it’s not multi-vendor that increases participation or satisfaction; it’s the ability to receive some help along the way. It doesn’t need to be from an investment advisor.
Another takeaway is that almost every vendor included in the survey offers the use of an advisor, as defined by the study. Yet, that advisor is not used, or the participant doesn’t know an advisor option is included (again, almost none of these vendors offer an Investment Advisor Representative who acts as a fiduciary 100% of the time). AXA (Equitable) was the fifth-ranked vendor in the “Advisor Option But Don’t Use” category and 14th for the “No Advisor Option,” meaning that a large number of people with a vendor that would be considered as offering an “Advisor Option” either don’t use the advisor or don’t even know one exists. No wonder AXA (Equitable) didn’t want the details of this survey published.
Even when vendors that offer a very loose definition of “advisor” are used, a large chunk of the participants with those vendors don’t use or don’t know they have such an option.
Someone must explain how Fidelity came out on top of the “Use an Advisor” option.
The bottom line is that the evidence does not support this claim unless we completely redefine what the term “Advisor” means. The evidence might support the claim that when help is provided, satisfaction increases.
We found the following Equitable survey, conducted in 2022 of 1,001 U.S. K-12 educators who contribute to a 403(b) plan: “The value of the advisor.” No methodology or underlying statistics are provided, and the survey has not been published in any peer-reviewed journals.
Claim:
“Matching contributions are a key driver of retirement savings. The research found a 23% increase in participation where employers match contributions. However, only 6% of school districts currently do so.”
Response:
The NTSA finds a factor that dramatically increases participation, and it’s almost an afterthought in the research. There was no further insight into how matching can increase participation, what rates work best, how a match should be structured, or how plans that offer matches are structured. While my bias is to believe that matches increase participation, they didn’t do enough analysis even to determine statistically whether it was the match that increased participation, as they didn’t look at what other factors might have affected those plans.
The paper concludes with a call to use an independent third-party administrator and fails to define what that means. It then recaps the claims that I’ve debunked here.
Conclusion
This comprehensive survey is anything but. The evidence offered did not support the claims, and no reputable journal I’m aware of would allow this type of research to be published.
If you think debunking propaganda like this document is a waste of time, consider that this information is being spread to school districts, associations, and even the Federal government. It can be found in a CT ASBO presentation and a Federal Government GAO paper (here).
Scott Dauenhauer, CFP, MPAS
The Tortured Finance Guy Department