I've been pondering the idea of digital microcurrencies lately. Specifically, I've been wondering what would happen if an internet site were launched that gave subscribers the right to create, access, and exchange currencies. The site would serve as a neutral party, process transactions, and provide a set of tools. Each currency would be configurable, and would be required to file a standard set of rules determining the size of the money supply, the terms for exchanging that currency for other microcurrencies, and the governance of the currency itself - who had the power to modify its rules, and under what conditions.
Why, you ask, would this be a good idea? I think that this sort of a project could be useful at a variety of different levels. It could serve as a valuable teaching tool, allowing students and teachers to create accounts, and to make and experiment with markets. It could also serve as a system to facilitate the use of parallel, alternative currencies. Since many people have skills or talents, but have been pushed out of the conventional cash money economy, they could benefit from access to this sort of currency system - a neighborhood or church group or collection of friends could create a common currency, and use that currency to facilitate the exchange of useful services, without having to resort to the use of conventional currencies.
Unlike bitcoin, these currencies would be intended to serve as tools to facilitate economic activity within bounded cultural and geographic areas. A user of the service would pay a modest fee, perhaps 15 dollars/year + some fee per currency managed. This would give a user the right to trade in and hold any currency, subject to the terms of use laid down in each currency's ground rules. No mechanism would be provided for exchanging any microcurrency units for any other variety of money.
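The "standard set of rules" each currency would file could be sketched as a small data structure. Every field name and value below is purely illustrative - an assumption about what such a filing might contain, not a description of any real system:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the "ground rules" a microcurrency would file.
@dataclass
class CurrencyRules:
    name: str
    max_supply: int                    # cap on total units in circulation
    issuance_per_year: int             # how many new units may be created annually
    exchangeable_with: list = field(default_factory=list)  # other microcurrencies
    amendment_rule: str = "2/3 vote of members"  # who may change these rules, and how

# A neighborhood group's currency, as imagined in the post above.
neighborhood = CurrencyRules(
    name="OakStreetHours",
    max_supply=10_000,
    issuance_per_year=500,
    exchangeable_with=["ChurchGroupCredits"],
)
```

The site itself would then only need to validate transactions against each currency's filed rules, which is what makes a standard, machine-readable format attractive.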
Thoughts?
Friday, October 19, 2012
Saturday, May 12, 2012
Economy 2.0, part 1: space vs Ron Paul
I was recently reading about the progress that's been made on artificial leaves, and this has gotten me into a somewhat more hopeful frame of mind. Instead of fixating on how we're currently making a mess of things - which we're certainly doing - I've been thinking a bit more about some of the changes that I see coming in the next five or ten years, and the more I ponder the future, the less depressed I am. It seems reasonable to me that we'll solve two big problems in the next decade. I suspect that we're ten years, at most, away from a better renewable energy technology. I don't know exactly what that technology will be, although my money is on some variety of artificial photosynthesis or some other nifty way to harness solar power.
The limiting factor on so many of the things that we do, now, is energy. Remove that limiting factor, and the world suddenly becomes a great deal more hospitable. Case in point: water wars. I've read reports suggesting that we're less than a decade away from major conflicts over water access. In a world where energy costs drop by 90%, however, water is suddenly no longer an issue, as huge desalinization plants become cheap enough to operate around the world. Everyone's life gets a little better. Hell, we can pump enough water to turn the Sahara into a breadbasket, if we want to. I'm amused by the way that this fits with the larger story of the industrial revolution, actually. On so many levels, industrialization has been built on energy technologies. First steam power, then a series of other technologies, greatly expanded the amount of energy at our collective disposal. By doing so, they freed up creative energy in other areas. If I were looking for novel long-term investment opportunities, I'd focus on fields that will really, really benefit from a vast decrease in the cost of energy.
But, you ask, how does this relate to the eternal struggle between space and Ron Paul? Well... my secret suspicion is that the founders of Google must loathe Ron Paul, because they're out to bankrupt him. His financial disclosure statements show that he is deeply invested in gold - surprising nobody. Could it possibly be a coincidence that the diabolical founders of Google are planning to flood the Earth with inexpensive precious metals from space? Doubtful. Of course, the main effect of space mining will be to reduce the cost of many rare metals that are crucial for various industrial processes, but a secondary effect will be to wreak merry havoc on some parts of the commodities markets. I'd be startled if gold doesn't lose at least half of its value within three years after the launch of the first successful asteroid mining expedition... an expedition which will be much easier in a world where energy is no longer scarce enough to serve as a limiting factor...
So... the science geek side of me is looking forward to the future, and not only because we're able to grow replacement organs to spec... although that's pretty nifty, too... What do you plan to do in the post-energy economy?
Sunday, April 8, 2012
How managed is your democracy?
The recent elections in Russia have gotten me thinking about the issue of managed democracy. Western academics, governments, and the media have focused their attention on some of the undemocratic processes at work in Russia. I'm certainly not a supporter of many of the electoral tricks and tactics that have been used by Putin and his allies over the course of the past decade. However, all the polling data that I've seen suggests that Putin's policies remain popular, and align with the wishes of a majority of the population. The educated, technologically-engaged younger generation in the larger cities is apt not to approve of these policies and practices, but this group is a vocal minority, rather than a repressed majority.
I would be very curious to see an indexed set of data that charts out the relationship between the expressed views and values of the population and the actual policies put in place by the governments of the world's democracies. I'd like to organize data on political, economic, social/cultural, and diplomatic affairs on a nation-by-nation basis, and chart out the degree to which each democratic government actually enacts the policies that a majority of citizens support. I'd also prepare the same data for several key non-democratic governments as well. I'd then chart the degree of divergence between the weighted views, values, and priorities of the electorate and the actual policies of the government, perhaps also factoring in areas where a party or candidate openly held an unpopular position (since voters could, presumably, take such views into account, and thus register stronger preferences through voting than the opinions expressed in polls).
My strong suspicion is that the index of democracy management would begin climbing around 1980 in most western nations, and that it would now be quite high in western Europe, the United States, and Canada. I also suspect that several non-democratic nations, including, most significantly, China, as well as Russia, now do a better job of enacting the policies that their citizens value than do the democracies of western Europe and North America. I've read a fair bit of the polling conducted by the Pew Charitable Trusts, such as an analysis of the views of ordinary Americans on whether debt reduction or job growth should be prioritized, where Congress and the President both basically ignored the priorities of the electorate.
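The divergence index described above could be computed as a weighted gap between polled support for a policy and whether the government actually enacted it. The issues, weights, and figures below are invented purely for illustration:

```python
# Toy "democracy management index": weighted mean absolute gap between the
# share of citizens supporting a policy and whether it was enacted.
def management_index(issues):
    """issues: list of (public_support, enacted, salience_weight) tuples.
    public_support is in [0, 1]; enacted is 1.0 if adopted, else 0.0.
    Returns 0 for a perfectly responsive government, 1 for a fully unresponsive one."""
    total_weight = sum(w for _, _, w in issues)
    gap = sum(abs(support - enacted) * w for support, enacted, w in issues)
    return gap / total_weight

# Invented sample data, weighted by how much voters care about each issue.
sample = [
    (0.70, 0.0, 2.0),  # 70% favored prioritizing job growth; government did not
    (0.55, 1.0, 1.0),  # 55% favored a policy that was adopted
]
index = management_index(sample)
```

With real polling and legislative data organized nation-by-nation, the same calculation could be run per country and per year to produce the trend line the post speculates about.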
I'm concerned that democracy, particularly in the United States, is managed on several different levels. Our electoral structures are archaic and intentionally un-democratic, with the electoral college, gerrymandered district boundaries, and similar problems. The two main political parties have a near-stranglehold on politics, and each makes use of a series of undemocratic practices in order to ensure that acceptable candidates are put forward. Money influences the course of our elections in myriad ways, and gives far more weight to the votes of those who have it than to those of people who do not. The system of checks and balances also serves to limit governmental responsiveness to the will of the people, as it was always meant to do. Finally, the media serves as a filter, and press coverage further distorts electoral reality. All in all, I think that we'd best be very, very careful when accusing Russia of failing to hold free and fair elections...
Saturday, February 25, 2012
Get medieval on education...
Medieval universities, although structured to some degree by charters and royal or clerical authority, were far less tightly-structured than modern universities. In most cases, the students determined who would be able to give lectures. If a particular scholar had a sufficiently good reputation, students might commission that scholar to teach a class. Terrible teachers did not get many students, and thus received little or no income.
I think that this model of education has a great deal to recommend it. What student would actually want to take a class from a bored senior faculty member going through the motions in order to meet his or her nominal teaching obligations? Leading scholars might attract more students, who would be drawn in by the force of towering academic reputations. Other, younger, scholars might focus entirely on the teaching side of academia, and the best and most interesting among them would attract more students, and attract students willing to pay a premium.
For a system like this to work, two of the functions served by modern universities need to be separated. Currently, universities both instruct and evaluate the quality of instruction. This is a system that's simply built to fail. Students, faculty members, and administrators all have an incentive to wink, nod, and pretend that education is working just fine, thanks, regardless of what is actually taking place in the classroom. As a result, the grades given by universities are not very good at conveying information, and students are rarely pushed terribly hard in their classes.
I'd be very interested to see the credential-giving and evaluation functions of modern universities spun off into different entities. We do this to some extent already in the professional fields, with medical boards, professional engineering exams, and bar exams. Credentialing agencies, when independent, would have a real interest in making sure that anyone who received their credentials had actually earned them - they'd have no product to sell apart from their ability to accurately convey real knowledge about the skills and training that particular students had accrued.
Students could learn in any way that they wished, whether from books, YouTube, ancient meditation techniques, or from paid educators. All of the messy structures such as departments, programs, majors, and the like, as well as most of the administrative side of a typical university, would no longer be necessary. If students didn't feel that they were getting good educational value for their dollars, they could walk out of class, study using other teachers or resources, and obtain certifications regardless. In this system, students would have an incentive to seek out demanding (but skillful) instructors, since there would be a real advantage to putting in more work. And, in most cases, students would pay far, far less money for an education. (In a typical university, between 1 and 5 students are enough to pay the salary of the professor leading the class... so students could get together with a few friends and each pay 3000 dollars for a very small class, or get together with a larger group of 20 or so colleagues, and pay a few hundred dollars per course. Either would be an improvement, in cost and quality of instruction, over the current system.)
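The tuition arithmetic in that last parenthetical can be checked with a quick back-of-the-envelope calculation. The salary and course-load figures below are assumptions chosen for illustration, not data about any actual institution:

```python
# Rough check of the claim that a handful of students can cover the cost
# of the professor teaching them, once the university overhead is gone.
salary = 80_000          # assumed annual professor salary
courses_per_year = 4     # assumed teaching load

cost_per_course = salary / courses_per_year   # cost to fully fund one course

# A very small seminar versus a modest class of 20.
per_student_small = cost_per_course / 7
per_student_large = cost_per_course / 20
```

Under these assumptions a course costs $20,000 to staff, so seven students would pay roughly $2,900 each and twenty students $1,000 each - in the same ballpark as the figures in the post, and far below typical per-course tuition.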
Sunday, January 15, 2012
The importance of guilds
I've always been dubious about the utility of group work in a classroom setting. In my experience, nobody is ever satisfied with group work assignments. Most groups tend to divide work poorly, and feature personality conflicts. This leaves students cranky, and produces work of uneven quality. Work produced by groups filled with students who make unequal contributions also poses a serious challenge for evaluation.
I think that the structure of MMO guilds might offer some ways to improve collaborative work. Being in the right guild, after all, kept me playing WoW long after I'd grown bored with the game itself, and motivated me to invest a great deal of time in aspects of the game that I never much cared about on their own. A good guild is an association that is durable over time, based on mutual consent and cooperation, civil and pleasant, and composed of individuals with a wide range of skills and talents. All of those characteristics are potentially useful in other forms of social organization.
Consider, as an experiment, student learning groups that are formed on a voluntary basis, and which exist throughout much or all of the educational process. Students might have an option to join a learning guild at some point in their second year on campus, and to remain in one until they graduated. Members of a guild could leave, and could also evict unproductive colleagues. These guilds would have some legal standing in the university community, and would endure beyond the scope of a single class or assignment. This system could be particularly useful in online education, where students lack the social bonds that serve to motivate some learners in ordinary classrooms, and where courses are often much shorter (which makes it significantly harder for students to form effective working groups for a single class).
Alternatively, consider a system in which a pool of employees at a corporation are allowed to form themselves into durable groups to tackle assigned tasks from a queue. Shared success and failure, combined with the ability to exit or to evict unproductive team members could produce better focus on the goals of a particular project, while allowing individuals to make the best use of their own particular skills.
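The membership mechanics described above - durable groups, voluntary exit, eviction of unproductive members, and a shared task queue - could be sketched roughly like this. All names and rules here are invented for illustration:

```python
# Minimal sketch of a durable work guild: members may join or leave freely,
# a majority of the other members may evict someone, and the group pulls
# assigned tasks from a shared queue.
class Guild:
    def __init__(self, name, founders):
        self.name = name
        self.members = set(founders)
        self.completed_tasks = []

    def join(self, person):
        self.members.add(person)

    def leave(self, person):
        self.members.discard(person)

    def evict(self, person, votes_for):
        # eviction requires a majority of the other members' votes
        if votes_for > (len(self.members) - 1) / 2:
            self.members.discard(person)

    def claim_task(self, queue):
        # groups tackle assigned tasks from a shared queue
        if queue:
            self.completed_tasks.append(queue.pop(0))

tasks = ["quarterly report", "data migration"]
g = Guild("NightOwls", ["ana", "ben", "chloe"])
g.claim_task(tasks)
g.evict("ben", votes_for=2)   # two of three members vote to evict
```

The point of the sketch is that both the academic and the corporate variants need only these few primitives; everything else - reputation, shared success, tiered difficulty - is built socially on top of them.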
Like good gaming guilds, the better academic guilds or professional guilds would persist over the course of years. Some successful academic guilds might opt to enter the job market collectively, in order to capitalize on their demonstrated ability to work effectively together. Group bonds would drive productivity, in the same way that my desire not to be a burden on a raid led me to spend endless hours farming materials to secure enchantments or other minor gear upgrades so as not to bring shame and sorrow on my guild. The proper structuring of academic goals or professional obligations could even make use of the same tiered difficulty that allowed guilds to learn to work together through ever-more-challenging raid instances.
Thursday, January 5, 2012
Building a better exocortex
Ever since I read Charlie Stross' "Accelerando", I've been intrigued by the idea of the exocortex. He presents the idea (which I've met elsewhere, as well), that our cognitive processes are becoming ever-more-closely bound together with technology. I feel that I'm part of a transitional generation. My phone remembers my contacts for me, but I am, in fact, able to read a map on my own, and don't generally make use of Google's canned directions. I use Wikipedia to check up on dates if I can't recall them right away, but I personally remember a good deal of basic information. My students are more apt to rely entirely on technology to provide factual information and to handle certain basic tasks. I suspect that this is inevitable, but I do think that the process of integrating human intelligence and machine intelligence needs to be managed more carefully and thoughtfully than is currently the case.
I worry that we have become too reliant on the internet, and particularly on black-box search engines, to provide indexing for information about the world. If Google refuses to list a website, does that website really exist? If Google removed every reference to Alexander II, for some reason, most of my students would have no way of knowing that anything was amiss. Google isn't likely to delete Alexander II from the historical record, but Google and other search engines do actively manage the presentation of data. This is a point of some concern for me.
If it were feasible, I would gladly pay a reasonable amount of money to have access to a fully-transparent and customizable search engine. Search engines (as well as social networking sites) are so crucial in managing our digital lives that it seems very strange to me that we allow them to operate as ad-supported black boxes. This isn't exactly the same thing as having an outsourced and ad-supported corpus callosum, but it's not unrelated. Our connections to external information are critical for shaping our understanding of the world, and for determining how we lead our economic, social, and political lives. Google and Facebook currently make only crude use of their ability to steer our attention and shape our thought processes, but this will inevitably change, as newer tools and algorithms become ever-more-efficient in integrating the internet into our normal processes of cognition, and as transitional dinosaurs of Gen X and before become a smaller and smaller part of the population.
I'm hoping for a fee-for-use search engine sooner rather than later. DMOZ partially addresses this issue, and I'm intrigued by its crowd-sourced structure, but I'd still much prefer a fully-customizable engine that allowed me to prioritize or ignore certain websites or categories of information, and to manage the data structures used to organize and present information. As more and more of my cognitive processes are linked, in one way or another, to the internet, I'm more and more interested in the form, structure, and biases of the software that forms those linkages.
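The kind of user-side customization described here - boosting or suppressing particular sites - amounts to re-ranking results with per-domain weights that the user controls. The scoring scheme below is a pure invention for illustration, not how any real search engine works:

```python
# Re-rank search results using per-domain weights set by the user.
# A weight of 0 effectively removes a domain; >1 boosts it.
def rerank(results, domain_weights, default=1.0):
    """results: list of (url, base_score) pairs from some underlying engine."""
    def weighted(item):
        url, score = item
        domain = url.split("/")[2]   # e.g. "en.wikipedia.org"
        return score * domain_weights.get(domain, default)
    return sorted(results, key=weighted, reverse=True)

# Hypothetical user preferences and engine output.
prefs = {"en.wikipedia.org": 2.0, "contentfarm.example": 0.0}
hits = [
    ("http://contentfarm.example/alexander-ii", 0.9),
    ("http://en.wikipedia.org/wiki/Alexander_II", 0.6),
]
top = rerank(hits, prefs)[0][0]
```

The transparency the post asks for lies in the user owning `prefs`: the re-ranking rule is visible and editable, rather than hidden inside an ad-supported black box.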
Sunday, January 1, 2012
Corporate Apprenticeships?
So, after registering this blog some number of months ago, I'm finally opting to post something.
I've been thinking a good deal lately about the relationship between a college education and employment in the modern American economy. It seems that employers might want college-educated employees for one of two reasons. The specific skills acquired during college might be important, and a degree from a reputable institution provides some assurance to an employer that, say, a mechanical engineer knows a good bit about mechanical engineering. In other cases, a college degree might, instead, let an employer know that a student acquired (or at least demonstrated) a certain set of intellectual skills - the ability to communicate clearly, to craft a sound argument, and so forth.
The second variety of college education is very difficult to replicate without attending college, and, I'd argue, retains its value in almost any field of human endeavor. The utility of college degrees as a means of determining whether or not a particular student has actually achieved such a level of skill has decreased over past decades, as for-profit schools have moved into the market and disguised themselves as real universities, and as grade inflation and the need to retain students have led many other schools to reduce workloads and raise grades and graduation rates.
The first variety of college education, however, is the area where I think that our current system serves students least well. The vast majority of students in technical fields are bound for work in the corporate world or in other large enterprises. The organizations that will hire these young scientists, programmers, and engineers have very specific needs. Perhaps a firm needs database wranglers, or a research laboratory needs another scientist to help investigate certain biochemical processes. Currently, most of these positions are filled by men and women who study in programs little changed since the time of Isaac Newton, save for a slight narrowing of focus.
Instead of training engineers or programmers who are skilled in many, many different areas, but who will use only a tiny fraction of their skills, why not, instead, train most technical workers as modern-day apprentices? Corporations could hire young people right out of high school, and offer on-the-job training designed to teach exactly the skills needed for specific jobs. The initial investment in these training programs would be far less than the cost of a conventional college education, in both money and time, as corporations would not support vast and archaic university bureaucracies. Students would have a real incentive to perform well, because, instead of grades, they would be receiving employment. Corporations, in turn, would have employees with precisely the skills needed, and could train such apprentices over the course of, perhaps, 1-3 years, while still gaining from the work that these apprentices performed. In essence, these positions would be built on current internship programs. Apprentices would be required to sign contracts obliging them to work for a set number of years, thus ensuring a reasonable return on corporate investment. Corporations, in turn, would have no obligation to their apprentices beyond the sunk costs involved in training them, and would still, therefore, have the freedom to modify the structure of their workforces as economic circumstances required.
