cosmic echo
On any given day on LinkedIn, you are likely to find posts from bootcamp graduates months into a fruitless job search. Their stories are so similar in plot that the idiosyncrasies of each search and work history read more like tweaks to a template than a portrait of an individual struggle. Their messages proliferate like a cosmic echo rather than the plea of a friend.
the dream
When you visit a bootcamp you notice a real buzz. Why? Well, it's a fantasy world. Where else can you find a group of 20 adults who have no coding experience and yet believe that, in a matter of months, they will be peers of people with decades of it? Nowhere. Why does acknowledging this belief matter? Because it is the belief students are sold, and it should be the lens through which the outcomes data is interpreted.
There are certainly positives to an atmosphere of organic excitement in a classroom: when students are excited, they are more creative and less discouraged by obstacles. That said, it is up to the educators to tether the excitement to reality: there is a fine line between euphoric imagination and actual learning, and it is the educator's job to draw it.
In bootcamps there is a conflict between the students' experience and the students' expected outcomes; or rather, a conflict between the type of instruction that should be occurring and the type of experience bootcamps promise to provide.
I remember when the instructors at my bootcamp started to pull back the curtain on our job market prospects. They sat us down in the classroom and addressed us in a tone devoid of the campy enthusiasm so present earlier in the program. "It might take some time to find a job" was the crux of the talk. Afterward, the attitude of the students shifted from jovial and optimistic to fearful and competitive. In the same way that euphoria and naivety are not the right combination of attitudes to hold while learning, neither are fear and competition. There are healthier options that can be fostered from the beginning.
The point of this article isn't to defame bootcamps; it's to call for a shift in attitude toward them. It is to let a little air out of the spectacular balloon of promise that bootcamps present to potential students, and, conversely, to give potential students that wake-up call earlier rather than later. Lastly, it is to plant the seeds of concern for the philosophy of education. While outcomes data makes it easy to demonstrate large-scale patterns and tell stories, it does not provide a solution. For that, we need to understand the purpose of education, the philosophy of our institutions, and how the former can be integrated into the latter organically.
results, results, results
Bootcamps publish their outcomes in the form of Jobs Reports. Because of their popularity, there are currently hundreds of bootcamps. This analysis takes a look at four of Course Report's "Featured Schools" and two of SwitchUp's "Top 10."
I chose to leave the names of the bootcamps out of the article because the reporting processes and results appear to be industry-wide. Therefore it doesn’t seem fair to publish harmful details about the randomly selected schools.
the theme
Download your favorite bootcamp's Jobs Report and you will notice something strange: pages and pages of narrowing down the pool of students considered eligible for reporting. Each bootcamp has its own steps and criteria, but the systematic approach of excluding students whose outcomes would hurt the results is paralleled in every report. The process goes something like this:
a. Start with a population: "Our program started with x number of students."
b. Define a subpopulation using a vague, proprietary term: "Of x students, x% were considered 'job seekers.'"
c. Make a high-percentage claim about the subpopulation: "90% of students considered x graduated."
d. Repeat steps b. and c. until the remaining population consists of a high enough percentage of graduates who accepted full-time offers within a specified period (usually 6 months).
The loaded terms at each step encode a number of reasons for excluding students. Some are justified and some less so, but the numbers presented to potential students on websites and in advertisements are only the most favorable interpretations: the ones that exclude the most students from consideration.
When you artificially exclude students from the "eligible" population, you quietly inflate the percentage of "successful" students, because the successes become a larger share of a smaller pool.
After all the steps of redefining the population, fewer than 50% of the original students are even being considered for reporting.
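To make the arithmetic concrete, here is a minimal Python sketch of the dilution effect. The cohort numbers are made up for illustration; none of them come from a real report.

    # Hypothetical cohort: every count below is illustrative, not from any report.
    enrolled = 1000        # step a: everyone who started the program
    graduates = 850        # minus "non-graduates"
    job_seekers = 600      # minus "not job-seeking" students and non-responders
    placed = 480           # accepted an in-field, full-time offer in the window

    headline_rate = placed / job_seekers   # the rate marketed on the website
    honest_rate = placed / enrolled        # the rate against everyone who started

    print(f"headline: {headline_rate:.0%}")   # headline: 80%
    print(f"honest:   {honest_rate:.0%}")     # honest:   48%

The placement count never changes; only the denominator does, and that alone turns 48% into 80%.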
Here are the most common ways students are excluded:
Non-graduate
● The student does not make it through the program for one reason or another.
Out-of-field offer
● The Graduate obtains an offer within the Job Search Period, but it does not meet the classification requirements for "In-Field" (i.e., back to Starbucks).
Continuing education
● The Graduate elects to continue education within the Job Search Period. (Do online tutorials count?)
Not seeking a job for health, family, work authorization, or personal reasons
● The Graduate elects to cease a job search during the Job Search Period due to health, family, work authorization, or personal reasons.
On the surface, the categories are reasonable, but the unfortunate reality is that there isn't enough visibility into how students are placed in them to take a hard stance.
The last population we haven't touched on is non-responders: the students who don't respond to Career Services at all. Non-responders are excluded from the results data in every report. But in reality, most non-responders are probably not success stories. Nor, as one of the reporting sheets provided by the bootcamps themselves suggests, are they a small category.
the outcomes
As mentioned earlier, I chose to leave the names of the bootcamps out of the article. The systematic approach was so similar among the six reports I read that I have to assume it is industry-wide. Therefore it seems unfair to single out and name only a few of the institutions.
Bootcamp #1
For the years 2019 and 2020, there were 854 students enrolled in the program. 749 graduates were considered "Available for Employment." The number of students employed in the field was 479. The number of students who received full-time offers of $90k or above was 261. That means the percentage of enrolled students who received a full-time offer of $90k+ was 30.6% (261/854). The percentage of starting students who received an offer of $70k+ was 45%.
The claims on Bootcamp #1's website tell a very different story.
Bootcamp #2
For 2020, the number of enrolled students across this bootcamp's 8 campuses and 2 remote programs totaled 731. The number of graduates considered "Job Seeking" was 536. The number of students who received a full-time offer of $90k or more was 210. That means the percentage of enrolled students who graduated and earned a full-time offer of $90k+ was 28.7% (210/731).
That is a stark contrast to what is displayed on the “Outcomes” page on Bootcamp #2’s website.
Bootcamp #3
For 2020 the number of students enrolled in Bootcamp #3’s On-Campus and Online programs was 2,262. Of those students, 1,942 graduated.
1,510 completed a Job Search Cycle, and 1,293 accepted a job offer within the reporting period; 432 were considered "not job-seeking." Of those who accepted offers, 950 took a full-time salaried role and 246 took a full-time contract, internship, apprenticeship, or freelance role.
So, if you include the full "addressable population" of 2,262, the percentage of students who received a full-time salaried offer was 42% (950/2,262), and among that group the average salary was $69,895. Counting the students who accepted full-time but non-salaried roles as well, the percentage climbs to 52.9% (1,196/2,262). Bootcamp #3's website, meanwhile, boasts an 86% employment rate.
Bootcamp #4
“Let’s Talk Real Results” is plastered front and center on Bootcamp #4’s outcomes page, followed by a claim that their data is “thoroughly analyzed by a third party called Social Finance.”
Social Finance just happens to be the name of the company that invested $5.8 million in their bootcamp, so I’m sure there is no conflict of interest.
They even give you a guide, "Learn to decode bootcamp outcomes," in which they warn against the very bad practices they themselves partake in, including:
● Outcome data should be evergreen and fresh. Look for a track record of consistently providing outcome data. Avoid spotty outcomes based on one six-month period from two years ago.
● Be wary if a school only offers anecdotal outcomes based on a few individual stories.
They warn students to look out for all of these behaviors at other bootcamps; hopefully readers have enough sense to turn the magnifying glass back on Bootcamp #4 itself. They do not actually publish any results: there is nowhere to access the underlying data behind the high-level outcomes they display. You simply have to trust that they are telling the truth when they say 88% of grads make an average of $80k.
Bootcamp #5
For 2019, there were 4,313 students enrolled at Bootcamp #5. 3,663 were considered graduates, and only 2,276 were considered "Job Seekers." So only 52.8% (2,276/4,313) of the original cohort were even considered job-seeking. There is no average salary and no description of how many graduates received full-time roles. The only claim is that 90.5% of graduates who participated in their "Full-Time Career Services program" accepted a job in their field of study within 180 days.
Bootcamp #6
For 2020, only 86 graduates were included in the report, and the number of students who started the cohorts is not mentioned.
For 2017 and 2018, there were 361 students enrolled. 285 were "available for graduation." The number of students who received an offer of $80k or more was 106, which is 29.4% (106/361) of the original cohort. No average salary is reported.
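As a sanity check on the figures above, here is a short Python snippet that recomputes each headline against the full enrolled population, using only the counts quoted in this article (Bootcamps #4 and #5 publish no comparable counts, so they are omitted):

    # (enrolled students, students meeting the bootcamp's own "success" bar),
    # exactly as quoted in the report summaries above.
    reports = {
        "Bootcamp #1, $90k+ full-time offer": (854, 261),
        "Bootcamp #2, $90k+ full-time offer": (731, 210),
        "Bootcamp #3, full-time salaried role": (2262, 950),
        "Bootcamp #6, $80k+ offer (2017-2018)": (361, 106),
    }

    for label, (enrolled, succeeded) in reports.items():
        print(f"{label}: {succeeded}/{enrolled} = {succeeded / enrolled:.1%}")

    # Bootcamp #1, $90k+ full-time offer: 261/854 = 30.6%
    # Bootcamp #2, $90k+ full-time offer: 210/731 = 28.7%
    # Bootcamp #3, full-time salaried role: 950/2262 = 42.0%
    # Bootcamp #6, $80k+ offer (2017-2018): 106/361 = 29.4%

No headline clears 50% once the full cohort is counted.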
the end
The success rates of bootcamps vary greatly depending on which population you use for your interpretation. If you compare the number of students who started a program to the number deemed eligible for reporting by the bootcamps themselves, the difference is so drastic that it often drags a success rate of over 80% down to well below 50%. There are justifications for excluding students from results data, but it is hard to imagine any that justify excluding over half of the population. To me, this screams that bootcamps are using the wrong metrics for success and are philosophically unaligned with their students. Those other 50% are real people too: people who pursued a dream, a promise from an institution. They quit jobs, spent tens of thousands of dollars, and gave months of their lives to this pursuit. They deserve to be included.
The data discussed in this article comes directly from the bootcamps' published Jobs Reports. If you feel I misrepresented anything, missed important considerations, or flat-out disagree, let me know. If you'd like the names of the bootcamps and/or links to the jobs reports used for this article, feel free to reach out.
Comment below if you want to hear about specific bootcamps or want to share your opinion on the bootcamp industry.
Original article here:
https://medium.com/@osoh_app/coding-bootcamp-promises-vs-real-world-outcomes-a-50-success-differential-c9443173f5a1