A certification system to replace high-school and college.
With the explosion of independent study at all education levels, certification is the main missing piece. One solution is tests; for example, Pearson is offering this service to Udacity students. However, certification-by-testing has had a hard time acquiring prestige. In the high-status parts of the software industry, getting Java/Microsoft/etc. certification is a slight negative on your job value—i.e., one is expected to countersignal.
So, we need a certification system that succeeds at serving as a signal.
What successful examples can we find? The actuarial industry has a system of advancement with ten exams. There is no requirement to get a certain degree to take them. The top level is considered an intellectual achievement roughly equivalent to a PhD.
Perhaps the certification we’re offering should test useless skills which require a long time to acquire, proving that one is not just smart but hard-working. Compare Latin in earlier periods, and the software language Scheme (a language used mostly for theory, not for product development) in the software industry today.
The usual trappings of signaling, like association with prestigious people, would be an essential part of the marketing.
Isn’t that a bit what Stack Overflow is doing with their careers program? “I have 10 000 points on Stack Overflow” is certainly a sign of quality (more than a degree in CS from an average school, or 10 000 points on Reddit or LessWrong); plus potential employers can verify how exactly those points were obtained.
Perhaps the certification we’re offering should test useless skills which require a long time to acquire, proving that one is not just smart but hard-working. Compare Latin in earlier periods, and Scheme (a language used mostly for theory, not for product development) in the software industry today.
Latin was far from useless in ‘earlier periods’. It allowed educated people from all over Europe to understand each other and contribute to a unified body of knowledge, much like English does today (but for much more than just Europe).

Yes, but I’m thinking of the time period of roughly the first half of the twentieth century.
In the high-status parts of the software industry, getting Java/Microsoft/etc. certification is a slight negative on your job value—i.e., one is expected to countersignal.
Why is that? That wouldn’t have surprised me too much if it had been about academia, or about the free/libre/open source software community, but the software industry… why?
Because it signals that you’re the sort of person who feels a need to get certifications, or more precisely that you thought you actually needed the certification to get a job. (And because the actual certifications aren’t taken to be particularly hard, so completing one isn’t strong evidence of actual skill.)
And because the actual certifications aren’t taken to be particularly hard, so completing one isn’t strong evidence of actual skill.
OK, I get it now. I don’t list my ECDL (which I took in high school) on my CV because I think it’s so basic that potential employers (who have any kind of clue) would think “huh? so what?”, but I assumed that Java/Microsoft/etc. certifications were nontrivial to get.
There’s that, and there’s also (from personal experience) an element of superhero bias (or overcompensation for some bias? I forget which way this one goes), where someone who does not have a certification but can code something optimally is seen as de facto superior to someone who does have a certification and codes the same thing just as optimally.
Additionally, there may be some reciprocal signaling involved: if I look for certified programmers, people will see mere certification as sufficient to get the job, which is not what I want—I want people with actual ability. Thus, I should hire people with ability but no certification, which signals that the certification is “useless” or “not what we’re looking for” relative to other criteria.
This seems to even out to a reflective equilibrium where official certification is a net negative.
The actuarial industry has a system of advancement with ten exams.
Perhaps this is the key. Instead of coming up with our own replacement certification system, maybe we need to make it easier for companies and industries to create their own. They’re the ones who know what matters for their own fields.
As an entry point, one might create an online job application builder. Questionnaires are easy (and probably not worth a startup), but if the application could have “code this” questions, and the answers were checked on the server, that could be a killer feature for tech companies.
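To make the “code this” idea concrete, here is a minimal Python sketch of what server-side checking might look like: run a submitted function against hidden test cases and return a pass rate. The question, function name, and test cases are made up for illustration, and a real deployment would need sandboxing for untrusted code.

```python
# Sketch of server-side grading for a "code this" application question.
# Illustrative only: exec'ing untrusted submissions is safe only inside
# a sandbox (container, seccomp, etc.), which this sketch omits.

def grade_submission(source_code, func_name, test_cases):
    """Run a candidate's submitted function against hidden test cases.

    Returns the fraction of test cases passed (0.0 to 1.0).
    """
    namespace = {}
    try:
        exec(source_code, namespace)   # load the submission
    except Exception:
        return 0.0                     # syntax or load-time error
    func = namespace.get(func_name)
    if not callable(func):
        return 0.0
    passed = 0
    for args, expected in test_cases:
        try:
            if func(*args) == expected:
                passed += 1
        except Exception:
            pass                       # a crash counts as a failed case
    return passed / len(test_cases)

# Hypothetical question: "write a function that reverses a string"
submission = "def reverse(s):\n    return s[::-1]\n"
cases = [(("abc",), "cba"), (("",), ""), (("aba",), "aba")]
score = grade_submission(submission, "reverse", cases)
# score == 1.0
```

The candidate never sees the hidden cases, so the same question can be reused across applicants, and the employer gets a machine-graded score instead of a self-reported skill claim.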
Yes, but supervision is essential for the tests to be reliable. The basic solution to that is to set up hundreds of in-person test centers, with proctors, as Pearson has. On-site testing can be minimized with various imperfect techniques, like letting people take some tests at home with cameras showing that they aren’t cheating, and then using on-site tests as final confirmation of scores.
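The “on-site tests as final confirmation” step could be as simple as flagging candidates whose supervised score falls well below their at-home average. A toy Python sketch, where the tolerance value is an arbitrary assumption:

```python
# Toy sketch of using a supervised on-site test to confirm at-home
# scores. The 0.15 tolerance is an arbitrary illustration, not a
# calibrated threshold.

def flag_for_onsite_retest(home_scores, onsite_score, tolerance=0.15):
    """Return True if the supervised on-site score falls well below the
    average at-home score, suggesting the camera-supervised at-home
    results may be inflated."""
    avg_home = sum(home_scores) / len(home_scores)
    return onsite_score < avg_home - tolerance
```

This keeps most testing cheap (at home), while the occasional proctored session bounds how much cheating can pay off.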
As a startup, having hundreds of centers might be a bad idea. While you are still growing, see if you can make deals to test at local libraries, YMCAs, churches, schools, or even local businesses that sometimes hold classes, like Michaels.
Even with that, there might be easier ways. I remember attending a community college in NC online. I had to give a presentation as a project; the school was 3-4 hours away, and the presentation had to be in the early morning. The instructor let me do it over the phone.
That doesn’t scale up very well, obviously, but your main advantage early on, as an online service, is being able to tap into the market all over the US and beyond. You might have one state with fewer than a dozen students. It’d be terrible for them to have to travel hours, early in the morning, to one building that only needs to be open maybe seven days a year.
I very much want this to exist. Perhaps a more specific implementation tactic would be alternatives to compulsory education. Of course, the signaling part remains a major problem.
One other element of solving the signaling problem is countering a particularly available class of responses: “What makes your special certification any different from all those bogus ‘buy-a-high-school-degree-online’ diploma mills?”
Establishing trustworthiness is also made more difficult by the fact that employers don’t seem willing to verify and learn about nonstandard accreditations. If someone has qualifications that employers don’t expect and don’t immediately recognize as a good signal, those qualifications will be dismissed without further investigation. Targeting employers seems like a requirement for an optimal certification system.
Many countries, including Israel where I live, have long had matriculation tests at the end of high school. Passing them is considered far more important than the high-school graduation diploma, which is given separately.
You can sign up to take the matriculation tests even if you are not in high school. This option is generally intended for drop-outs who are catching up later in life, but you can do it as a teenager too.
After passing the tests, no one cares about your actual high-school grades.

What are the reasons for this? Did anyone here try it?
It seems like testing is the go-to idea among contributors thus far for determining whether a person has achieved a level of proficiency in a field commensurate with earning a diploma from a reputable accredited university. While I have no data to support the following conjecture, I wonder whether electronic testing is a valid method of determining anything but memory recall at a specific point or set of points in time, even if the testing involves multiple steps (e.g., midterms and a final).
Why not use projects commissioned or suggested by interested corporations, involving teamwork, leadership, logistics, creativity, and work ethic? This would also provide opportunities for prospective employees—people who may have been using Khan Academy alone for years and have not developed the contacts and overall sense of common academic context college students develop over time—to build those all-important working relationships. Additionally, this would allow employers more control over the skill sets they actively seek out, and give self-teaching students an opportunity to understand the kinds of skills that will actually get them where they want to go in their careers.
Corporations or individuals would use paid accounts to have the opportunity to work with our teams to determine the kind of project that would most help them find the talents they need, and also help determine the conditions of success.
Projects would ideally have practical applications and real-world effects, and any successful projects that end up turning a profit would have predetermined payout models to distribute income between the patron, our company, and the prospects who actually worked on the project.
Students who wished to try for a project could pay a one-time fee for a lifetime account, and if possible this fee should be coverable by as many forms of reputable student financial aid as possible.
It’s 2:00 in the morning and I just got back from Burning Man so I doubt my idea is actually coherent or worth pursuing, but on the off chance it is a good idea, I will just post this now and hope it is productive and promotes thoughtful discussion, if not actual support. That said, if there are any holes in the business model or logic that you post and no one else decides to address them, I will take another crack at it tomorrow.
Collaboration, independent work, etc. are very valuable and are needed.
Supervised tests also have a role to play.
It costs a lot to have an expert grade an entire project.
Tests can be standardized, giving comparable results across ten thousand people. I don’t know whether 10,000 people could usefully be asked to, for example, develop a PHP email webapp as their trial project without many versions of the solution leaking onto the Internet.
Supervised tests minimize the opportunity for cheating.
If someone does a team project, then even if they did do their share, you don’t know what specific skills they have.
Studies of whether such tests correlate with other success metrics show that they do.
Except for a tiny minority of hot-shots (too few to support a business, and they generally find their way in life anyway), the type of independent project that most people are capable of is too trivial to give insight into their abilities.
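As a toy illustration of the comparability point above: standardized raw scores can be mapped to z-scores against the whole pool, so thousands of test-takers get directly comparable results. A minimal Python sketch:

```python
from statistics import mean, pstdev

# Toy illustration of why standardized tests give comparable results
# across a large pool: convert raw scores to z-scores (standard
# deviations from the pool mean).

def standardize(raw_scores):
    """Map raw scores to z-scores against the whole pool."""
    mu, sigma = mean(raw_scores), pstdev(raw_scores)
    return [(s - mu) / sigma for s in raw_scores]

# With a pool of [60, 70, 80], the middle scorer is exactly average
# (z = 0), and the top and bottom scorers are symmetric around it.
```

Nothing comparable exists for ten thousand open-ended projects, each graded by a different expert against a different rubric.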
My primary question then is this: are these shortcomings enough that such a model should completely leave our consideration as an alternative?
My goal with this is to provide choice to employers and ambitious people. The projects would be things the corporations want to achieve, don’t mind sharing the results of with everyone (think along the lines of a practical dissertation), and would normally be able to achieve themselves (and possibly already have a rubric for grading, since such projects must be a normal part of those businesses), but do not wish to invest more resources in simply for a more immediate, guaranteed return on investment—at the cost of missing out on discovering new talent.
Also, why not make projects interdisciplinary? The sort of rigorous documentation used for scientific studies could be adapted for the notes and regular progress reports students would make. Additionally, encouraging artists or multimedia-focused individuals to make visual or audio documentation of their progress engages more fields in the process and encourages interdisciplinary networking.
I think this idea may be contingent on the development of a much more far-reaching change in the education or possibly corporate models in order to function in the real world, but there are many potential benefits I can see to this.