Monthly Archives: June 2014

Federated Wikis – A Use Case

Mike Caulfield has been writing up a storm on federated wikis, a tool where users maintain their own site, then copy/fork individual pages they want to keep or edit from other sites in their federation. Today Mike, Tony Hirst, and Bill Fitzgerald had an energetic discussion about when to fork and when not to fork in a federated wiki (particularly in Ward Cunningham’s Smallest Federated Wiki, henceforth referred to as SFW, the federated wiki sandbox of the moment).

I think there’s a semantic issue here.  In a software context, you fork because you want to take the project in a different direction, as opposed to submitting patches to the existing codebase.  In a SFW context, a given wiki belongs to a particular individual, and only that individual can edit pages within that wiki.  Clicking the fork button does two things:

  1. Makes a copy of the page with the same name that you can edit as part of your wiki instance
  2. Starts tracking changes on the original

Even if you don’t intend to make changes, you may fork a page in order to have a local copy. As Mike points out, this is in and of itself a good thing.
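To make the mechanics concrete, here’s a rough sketch of what a fork amounts to. SFW stores each page as a JSON document with a story (the content items) and a journal (the edit history); the field names below follow that convention, but this is an illustration of the idea, not SFW’s actual code.

```python
import copy
import time

def fork_page(remote_page, origin_site):
    """Copy a remote page into the local wiki and note where it came from.

    `remote_page` is assumed to be an SFW-style page: a dict with a
    "title", a "story" (list of content items), and a "journal"
    (list of edit actions). Field names are illustrative.
    """
    local_page = copy.deepcopy(remote_page)
    # Record the fork in the journal, so the copy remembers its origin
    # and later changes can be compared against the original.
    local_page.setdefault("journal", []).append({
        "type": "fork",
        "site": origin_site,
        "date": int(time.time() * 1000),
    })
    return local_page

# Forking makes a local, editable copy; the original stays untouched.
original = {
    "title": "Federated Wikis",
    "story": [{"type": "paragraph", "id": "a1",
               "text": "Users keep their own sites."}],
    "journal": [{"type": "create", "date": 1402000000000}],
}
mine = fork_page(original, "journal.example.com")
mine["story"][0]["text"] += " Forking copies pages between them."
```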

Some of this concern about forking and changes stems from conceptualizing SFW as a publishing platform. Maybe this isn’t the right concept.  Instead, I imagine a sort of public notebook that it’s very easy to copy from.  Since it’s mine, I can keep any page in whatever state I want, but anyone else can grab what they want while maintaining at least some sourcing/changelog.  In the discussion this morning, Mike mentioned that versioning a single document pushes a group of authors towards consensus, presumably because the system requires that everyone end up with one document.  Federated wikis show what can happen when that constraint is removed and people can create together, individually.  We’ve never really had that kind of tool before. What might you do with it?

Mike’s demo SFW project, The Hidden History of Online Learning, has been a fascinating introduction to the platform.  The federation allows users to create and organize in whatever way makes the most sense to them and to fork from others only what interests them.  What if you took it another step?

Your wiki is your central learning repository.  It allows access controls, so you have public (your portfolio), restricted-access (group collaboration), and private (your thesis first draft) pages.  Into these pages you can drag all sorts of content.  For that matter, you might be able to set permissions on each JSON object for very granular access control.  You might then export content to a publication platform like a blog when it’s in final form.
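SFW doesn’t offer anything like this today, so treat the following as a thought experiment: if each JSON item on a page carried an access label, rendering the page for a given visitor would just be a filter over the story. The field names and access levels here are hypothetical.

```python
# Hypothetical: SFW has no per-item permissions; this sketches what
# "granular access control on each JSON object" might look like.

PUBLIC, GROUP, PRIVATE = "public", "group", "private"

def visible_story(page, viewer_level):
    """Return only the story items the viewer is allowed to see."""
    allowed = {
        PUBLIC: {PUBLIC},
        GROUP: {PUBLIC, GROUP},
        PRIVATE: {PUBLIC, GROUP, PRIVATE},  # the owner sees everything
    }[viewer_level]
    return [item for item in page["story"]
            if item.get("access", PUBLIC) in allowed]

page = {
    "title": "Thesis Notes",
    "story": [
        {"type": "paragraph", "text": "Portfolio summary", "access": PUBLIC},
        {"type": "paragraph", "text": "Group draft section", "access": GROUP},
        {"type": "paragraph", "text": "First-draft thesis argument", "access": PRIVATE},
    ],
}

print([i["text"] for i in visible_story(page, PUBLIC)])   # what the world sees
print([i["text"] for i in visible_story(page, PRIVATE)])  # what the owner sees
```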

Hiding in Plain Sight

This week has been full of ‘hidden history’. Last week, Mike Caulfield launched the Hidden History of Online Learning, a federated wiki project. Today, Audrey Watters presented on the Hidden History of Ed-Tech at CETIS. Both projects start from a supposition that popular fascination with high-profile e-learning and ed-tech projects like Khan Academy and LMSs has pushed to the background individuals, technologies, and ideas that tended toward the progressive and decentralized.

A regular response to this seems to be surprise.  How did progressive and decentralized education get so marginalized?  Audrey gives an important clue.

AllLearn, short for the Alliance for Lifelong Learning, stressed that its classes were just that: an opportunity for continuing education and lifelong learning. Udacity stresses something different today: it’s about “advancing your career.” It’s about “dream jobs.” 

There’s been plenty of hype about these new online platforms displacing or replacing face-to-face education, and part of that does connect to another powerful (political) narrative — that universities do not adequately equip students with “21st century skills” that employers will increasingly demand. But by most accounts, those who sign up for these courses still fall into the “lifelong learner” category. That is, the majority have a college degree already.

Centralization and control are a logical path forward if the whole higher-ed process is first and foremost one of capital development.  The task of equipping learners with marketable skills invites the economy of scale that the xMOOC movement has at its core. The rising costs of postsecondary education push the entire sector towards an economic justification and orientation, and ed-tech follows.  The work of Papert, Ted Nelson, and Jim Groom reminds us that there is another way, just as the work of Maria Montessori and John Dewey reminded us there is another way.

There’s another way.  We know this.  We know what it is.  Why don’t we choose it?

We choose what we have because we believe it gets us what we want, economic advantage at low(er) cost.  This week Udacity announced nanodegrees, just enough MOOC certificates to get the promotion. If you were asked to create a system whose primary purpose was to allow learners to get a leg up in the rat race without burying themselves in debt, it would look a lot like the current collection of MOOC silos.

Maria Anderson quoted the following at Learning Analytics 12:

(If I find out the source, I’ll update this)

There’s the crux.  We — society — have the education system that we want: one that values economic efficiency and human capital.  The things we don’t like about institutional structures, the LMS, or online course design are a symptom of that emphasis.

Nugget Post: Man-Computer Symbiosis

 It is often said that programming for a computing machine forces one to think clearly, that it disciplines the thought process. If the user can think his problem through in advance, symbiotic association with a computing machine is not necessary.

If this is so, why has the last half century of technological advancement been about reducing the need for such clear thinking? From search engines, to WYSIWYG blogging platforms like the one I’m using now, to IDEs, to Siri, the whole notion of human-computer symbiosis shifts more and more of the “detail work” from the human to the computer.  Licklider believes this to be a good thing.  I’m not quite as convinced.

For a very long time, disciplined thinking, even more than knowledge recall, has been a mark, perhaps even the mark, of an educated mind.  That’s why we still teach formal logic, at least to some.  There is, I think, a perception that rigorous thinking belongs to the scientific culture (to reference C.P. Snow).  After all, if you don’t engineer rigorously, your software fails, your gadget breaks, and your building falls down.  The consequences of non-rigorous thinking about Marxist themes in Game of Thrones are arguably lower.

Licklider seems to argue that having the computers do the detail work frees up human creativity.  I compare his discussion of hours spent making plots to what can be done with a few lines of R and a few SQL queries.
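For a sense of the scale of that shift, here’s roughly what “hours of plotting” collapses into. The sketch is in Python (sqlite3 plus matplotlib) rather than R, and the database, table, and column names are made up.

```python
import sqlite3
import matplotlib.pyplot as plt

# Hypothetical data: a table measurements(t REAL, value REAL)
# in a local SQLite file.
conn = sqlite3.connect("experiment.db")
rows = conn.execute("SELECT t, value FROM measurements ORDER BY t").fetchall()
conn.close()

t, value = zip(*rows)            # unpack rows into two series
plt.plot(t, value)
plt.xlabel("t")
plt.ylabel("value")
plt.savefig("measurements.png")  # the plot Licklider spent hours drawing
```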

I see two hazards that come with this freedom:

  1. The freedom to think in broad strokes gets us collectively out of the habit of careful thinking.  Then, when we need that habit again, it must be relearned, at least to some extent.  Let’s just hope that relearning doesn’t happen mid-crisis.

  2. In the same vein, what happens if, as in E.M. Forster’s story “The Machine Stops”, the machine stops?  In the first episode of Connections, James Burke describes how the Northeast blackout of 1965 was so catastrophic because few knew how to work around widespread technological failure, even in the short term.

We do amazing things every day, but it all sits atop a massive infrastructure which is frighteningly fragile.  Are we one solar flare away from it all falling about our ears?

As We May Think: The Nugget of Negative Space and Associations which Spring Thence

I suspect I am not supposed to do this, but what grabbed me in Bush’s “As We May Think” was what wasn’t there.  Bush writes in great detail about how information can be gathered and associations built, but presumes that most of this information will have as its source the usual suspects (encyclopedias, printed periodicals, etc.).  He is almost silent on the notion of the end user as content creator. This is the blind spot of the piece.

It’s not surprising.  Futurists do a good job of predicting things which are a logical outgrowth of existing techne, but have a poor record with things which require conceptual breakthroughs. Jules Verne describes fax-machine-like devices in Paris au XXe siècle, but misses radio completely. Interestingly, his description of an education system which eschews the classics for the practical arts and (what we’d now call) STEM was pretty much spot on.

Blogs and social media and the like now allow us to think in public.  This is a notion all but unheard of a generation ago. It will be interesting to see the extent to which the major readings do or don’t address this notion of public thought.  In particular, I mean putting in public thoughts which haven’t yet been carefully edited/censored.  Prior to the web, if you read someone’s thoughts, they were (most of the time) already in an edited form that had been through several drafts.  Stephen Krashen, writing about language acquisition, describes an internal “monitor” that learners use to check their language output for errors and correct it.  The monitor is much more active when someone is writing than when they are speaking, since there’s not time to check individual spoken utterances without awkward pauses.

I wonder if internet writing has the effect of turning down all of our internal editors, and what that means for the future of written expression.

How does it feel when I think?

I recognize, first of all, that I am quite “late to the party”.  The opening days of the course happened to coincide with a planned vacation. Since I’m an open participant, I’m going to view the deadlines as suggestions (much to the chagrin of those in charge, I’m sure).

Most of the time, I think very verbally, at least when I’m thinking intentionally.  I “hear” words in my head as I think.  This isn’t to say my mind never wanders.  Especially when I was younger, I can remember occasions where I would stop myself mid-thought and wonder how I had gotten to thinking about ____.  Most of the time I could mentally backtrack through the chain of associations and determine how I’d gotten from point A to point B. When I have thoughts that are not words, they tend to be unbidden.  This will seem particularly unusual when I reveal that I have a degree in music composition.  I am in fact hard-pressed to put into words how my musical thought process works.  Maybe it’s a left brain-right brain thing.

More IndieWeb

Earlier, I mentioned the IndieWeb in a post that was at least ostensibly about security. Since then, I’ve done more poking about and have found that this is a fairly well organized movement. The IndieWeb concept actually has several components.

  • Identity – both having one’s own domain and using it, via rel=”me” attributes on links, as a source of identity on the web (think OpenID – The Next Generation)
  • POSSE (Publish (on your) Own Site, Syndicate Elsewhere).  The idea is to have the site you control be where everything is published first.  Then you push that content wherever else you want it to go (particularly silos like Facebook or Twitter).  Replies and comments are pulled back to the original site via webmentions and backlinks (see the sketch below). Known is an in-development, out-of-the-box solution for all of this, but I want to see what I can do with the tools (like WordPress) that I already have deployed.
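Here’s a minimal sketch of the webmention half of that loop: one site notifying another that it links to (or replies to) one of its pages. It assumes the receiving site advertises its endpoint in an HTTP Link header (a full implementation would also check the page’s HTML for a link element), and the URLs are placeholders.

```python
import requests

def send_webmention(source, target):
    """Tell `target` that `source` links to (or replies to) it.

    Endpoint discovery is simplified: only the HTTP Link header is
    checked; a fuller implementation would also parse the target's
    HTML for <link rel="webmention">.
    """
    resp = requests.get(target, timeout=10)
    endpoint = resp.links.get("webmention", {}).get("url")
    if not endpoint:
        return None  # target doesn't accept webmentions
    # The notification itself is just a form-encoded POST.
    return requests.post(endpoint,
                         data={"source": source, "target": target},
                         timeout=10)

# Placeholder URLs: my reply post, and the post it responds to.
send_webmention("https://example.com/2014/06/reply-to-a-post",
                "https://someone-else.example/original-post")
```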

On a similar front, Mike Caulfield has been posting up a storm about Smallest Federated Wiki and wondering what happens when a wiki becomes an individual repository, linked to other wikis via federation and forking, rather than a shared space.

Let’s face it, the big silos (Twitter, Facebook, G+) aren’t going away anytime soon. The advantages they have over the open web in terms of network effects and discovery will protect their position.  Projects like IndieWebCamp and SFW show how the silos can be routed around.