Sunday, January 15, 2012

The importance of guilds

I've always been dubious about the utility of group work in a classroom setting. In my experience, nobody is ever satisfied with group work assignments. Most groups divide work unevenly and suffer from personality conflicts, which leaves students cranky and produces work of uneven quality. Work produced by groups whose members make unequal contributions also poses a serious challenge for evaluation.

I think that the structure of MMO guilds might offer some ways to improve collaborative work. Being in the right guild, after all, kept me playing WoW long after I'd grown bored with the game itself, and motivated me to invest a great deal of time in aspects of the game that I never much cared about on their own. A good guild is an association that is durable over time, based on mutual consent and cooperation, civil and pleasant, and composed of individuals with a wide range of skills and talents. All of those characteristics are potentially useful in other forms of social organization.

Consider, as an experiment, student learning groups that are formed on a voluntary basis, and which exist throughout much or all of the educational process. Students might have an option to join a learning guild at some point in their second year on campus, and to remain in one until they graduated. Members of a guild could leave, and could also evict unproductive colleagues. These guilds would have some legal standing in the university community, and would endure beyond the scope of a single class or assignment. This system could be particularly useful in online education, where students lack the social bonds that serve to motivate some learners in ordinary classrooms, and where courses are often much shorter (which makes it significantly harder for students to form effective working groups for a single class).

Alternatively, consider a system in which a pool of employees at a corporation are allowed to form themselves into durable groups to tackle assigned tasks from a queue. Shared success and failure, combined with the ability to exit or to evict unproductive team members could produce better focus on the goals of a particular project, while allowing individuals to make the best use of their own particular skills.

Like good gaming guilds, the better academic guilds or professional guilds would persist over the course of years. Some successful academic guilds might opt to enter the job market collectively, in order to capitalize on their demonstrated ability to work effectively together. Group bonds would drive productivity, in the same way that my desire not to be a burden on a raid led me to spend endless hours farming materials to secure enchantments or other minor gear upgrades so as not to bring shame and sorrow on my guild. The proper structuring of academic goals or professional obligations could even make use of the same tiered difficulty that allowed guilds to learn to work together through ever-more-challenging raid instances.

Thursday, January 5, 2012

Building a better exocortex

Ever since I read Charlie Stross' "Accelerando", I've been intrigued by the idea of the exocortex. He presents the idea (which I've encountered elsewhere as well) that our cognitive processes are becoming ever more closely bound together with technology. I feel that I'm part of a transitional generation. My phone remembers my contacts for me, but I am, in fact, able to read a map on my own, and don't generally make use of Google's canned directions. I use Wikipedia to check up on dates if I can't recall them right away, but I personally remember a good deal of basic information. My students are more apt to rely entirely on technology to provide factual information and to handle certain basic tasks. I suspect that this is inevitable, but I do think that the process of integrating human intelligence and machine intelligence needs to be managed more carefully and thoughtfully than is currently the case.

I worry that we have become too reliant on the internet, and particularly on black-box search engines, to provide indexing for information about the world. If Google refuses to list a website, does that website really exist? If Google removed every reference to Alexander II, for some reason, most of my students would have no way of knowing that anything was amiss. Google isn't likely to delete Alexander II from the historical record, but Google and other search engines do actively manage the presentation of data. This is a point of some concern for me.

If it were feasible, I would gladly pay a reasonable amount of money to have access to a fully transparent and customizable search engine. Search engines (as well as social networking sites) are so crucial in managing our digital lives that it seems very strange to me that we allow them to operate as ad-supported black boxes. This isn't exactly the same thing as having an outsourced and ad-supported corpus callosum, but it's not unrelated. Our connections to external information are critical for shaping our understanding of the world, and for determining how we lead our economic, social, and political lives. Google and Facebook currently make only crude use of their ability to steer our attention and shape our thought processes, but this will inevitably change, as newer tools and algorithms become ever more efficient at integrating the internet into our normal processes of cognition, and as the transitional dinosaurs of Gen X and earlier generations become a smaller and smaller part of the population.

I'm hoping for a fee-for-use search engine sooner rather than later. DMOZ partially addresses this issue, and I'm intrigued by its crowd-sourced structure, but I'd still much prefer a fully customizable engine that allowed me to deliberately prioritize or ignore particular websites or categories of information, and to manage the data structures used to organize and present information. As more and more of my cognitive processes are linked, in one way or another, to the internet, I'm more and more interested in the form, structure, and biases of the software that forms those linkages.
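The kind of user control I have in mind could be as simple as a transparent re-ranking pass over whatever raw results an engine returns. Here's a minimal sketch in Python; every name in it (the result type, the per-domain weights) is purely illustrative, not any real search engine's API:

```python
# Hypothetical sketch: a user-owned re-ranking layer over raw search results.
# The user keeps a table of per-domain weights: 0 hides a site entirely,
# values above 1 promote it, and unlisted domains keep their original score.
from dataclasses import dataclass
from urllib.parse import urlparse


@dataclass
class SearchResult:
    url: str
    base_score: float  # relevance score from the underlying engine


def rerank(results, domain_weights, default_weight=1.0):
    """Scale each result's score by a user-chosen per-domain weight."""
    adjusted = []
    for r in results:
        domain = urlparse(r.url).netloc
        weight = domain_weights.get(domain, default_weight)
        if weight > 0:  # a weight of zero drops the site from view
            adjusted.append((r.base_score * weight, r))
    # Highest adjusted score first.
    adjusted.sort(key=lambda pair: pair[0], reverse=True)
    return [r for _, r in adjusted]


results = [
    SearchResult("https://example-aggregator.com/page", 0.9),
    SearchResult("https://en.wikipedia.org/wiki/Alexander_II", 0.6),
]
weights = {
    "example-aggregator.com": 0.0,  # ignore this site entirely
    "en.wikipedia.org": 2.0,        # trust this one more
}
ranked = rerank(results, weights)
```

The point of the sketch is that the ranking logic lives on the user's side and is fully inspectable, which is exactly what an ad-supported black box cannot offer.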

Sunday, January 1, 2012

Corporate Apprenticeships?

So, after registering this blog some number of months ago, I'm finally opting to post something.

I've been thinking a good deal lately about the relationship between a college education and employment in the modern American economy. It seems that employers might want college-educated employees for one of two reasons. The specific skills acquired during college might be important, and a degree from a reputable institution provides some assurance to an employer that, say, a mechanical engineer knows a good bit about mechanical engineering. In other cases, a college degree might, instead, let an employer know that a student acquired (or at least demonstrated) a certain set of intellectual skills - the ability to communicate clearly, to craft a sound argument, and so forth.

The second variety of college education is very difficult to replicate without attending college, and, I'd argue, retains its value in almost any field of human endeavor. The utility of college degrees as a means of determining whether a particular student has actually achieved such a level of skill has decreased over the past few decades, as for-profit schools have moved into the market and disguised themselves as real universities, and as grade inflation and the need to retain students have led many other schools to reduce workloads and raise grades and graduation rates.

The first variety of college education, however, is the area where I think that our current system serves students least well. The vast majority of students in technical fields are bound for work in the corporate world or in other large enterprises. The organizations that will hire these young scientists, programmers, and engineers have very specific needs. Perhaps a firm needs database wranglers, or a research laboratory needs another scientist to help investigate certain biochemical processes. Currently, most of these positions are filled by men and women who study in programs little changed since the time of Isaac Newton, save for a slight narrowing of focus.

Instead of training engineers or programmers who are skilled in many, many different areas, but who will use only a tiny fraction of their skills, why not, instead, train most technical workers as modern-day apprentices? Corporations could hire young people right out of high school, and offer on-the-job training designed to teach exactly the skills needed for specific jobs. The initial investment in these training programs would be far less than the cost of a conventional college education, in both money and time, as corporations would not support vast and archaic university bureaucracies. Students would have a real incentive to perform well, because, instead of grades, they would be receiving employment. Corporations, in turn, would have employees with precisely the skills needed, and could train such apprentices over the course of, perhaps, 1-3 years, while still gaining from the work that these apprentice workers performed. In essence, these positions would be built on the current internship programs. Apprentices would be required to sign contracts obliging them to work for a set number of years, thus ensuring a reasonable return on corporate investment. Corporations, in turn, would have no obligation to their apprentices beyond the sunk costs involved in training them, and would still, therefore, have the freedom to modify the structure of their workforces as economic circumstances required.