a newer, securer blog (for the meddling search bots)

Google is practicing social engineering by denigrating "non-secure" sites in search results, according to ars technica:

Sites that properly implement the transport layer security (TLS) protocol may be ranked higher in search results than those that transmit in plaintext, company officials said in a blog post published Wednesday. The move is designed to motivate sites to use HTTPS protections across a wider swath of pages rather than only on login pages or not at all. Sites that continue to deliver pages over unprotected HTTP could see their search ranking usurped by competitors that offer HTTPS.

So, as a genetically modified guinea pig I now have a "secure" version of this blog: https://tommoody.us. Bots, do your thing. Except, well, http://tommoody.us still exists and doesn't redirect anyone to the secure page(s).
It's like I now have a "secure" mirror site -- but it's a mix of http and https URLs.
For example, this post is "secure" (meaning the content is encrypted between the server and your browser and Malcolm in the Middle can't read along with you), but only because I manually changed the image URL of my "logo" to https in my WordPress theme editor. Whereas this post is not "fully secure" because it still has http in the image tag in the post.
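To be concrete about what trips the "not fully secure" warning (hypothetical markup below -- not my actual logo path), the difference is just the scheme in the image tag:

<img src="http://tommoody.us/images/logo.gif">  <!-- plain http: mixed content, padlock breaks -->
<img src="https://tommoody.us/images/logo.gif">  <!-- same file over TLS: page stays "secure" -->

Every embedded asset -- images, scripts, stylesheets -- has to come over https before the browser stops grumbling about mixed content.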

Am bleakly curious how the googlebot is going to handle this -- will it index my site twice? Will the http version of the same site be demoted in search results? Guess I need to do some reading (now that I've already taken this step). I'm sure the geniuses at Google won't make a mess of their altruistic behaviorism.

Update: Google wants you to treat the https site as a new site and redirect all your http URLs to it. That's more work than any mom-and-pop website that's built a smidgeon of web credibility should have to do just to be on the good side of Google's search bot. The political dimension of this is what's been predicted (and has been happening) for years -- the gradual downgrading of independent users in favor of larger corporate entities with full-time security staffs. One smug commenter on the ars technica post internalizes this as snobbish complaining about the http users stinking up his internet neighborhood: "I don't want you operating a fly-by-night dynamic web site with no security in the neighborhood my nephews do homework on." But what if a site doesn't use "dynamic" features such as online purchase forms? As I understand it, Google still plans to penalize it for not having https.

Update 2: With a bit of hassle and about an hour of downtime I was able to redirect the http pages to the new https site, which you should now be seeing.
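For anyone attempting the same move: on a typical Apache/WordPress setup with mod_rewrite (an assumption -- your host may differ), the permanent redirect amounts to a few lines near the top of the site's .htaccess file, something like:

# send every http request to the same URL on https (permanent redirect)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

The 301 status marks the move as permanent, which is supposedly the signal Google's bot wants before it transfers whatever "credibility" the old URLs had to the new ones.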

shoutbacks to the non-social media web

1. Thanks to Daniel DeLuna for the post of some of my works on paper. To clarify somewhat the accompanying quote he uses, my point wasn't that those tech-website commenters were saying that you had to be high tech. It was that they were making a false analogy between "writing your own code" and "grinding your own pigments." The latter has been a dead issue in the art world since before the 1960s. It's OK if you want to be a purist about writing code for digital-based artworks, just don't use that analogy to make the point.

2. On (Anti)Disambiguation, by Mikhel Proulx (which appears to be a pseudonym) in the journal Doubting (2012). Wow, an essay that mentions Internet Surfing Clubs in the context of the Habermasian public sphere and doesn't make you want to pull your hair out. This is a good, detached summation of that scene, and it's mostly still relevant, two years after its publication, to a kind of "authorless" art still happening online. A quick recap of Proulx's thesis: the trend of dominant culture is to "disambiguate" (for example, Wikipedia's lists of different possible meanings for the same word), while artists "anti-disambiguate" through remixes, mashups, crowdsourcing, etc. Finding a place for this to happen has become more of a problem in the last couple of years (since Proulx's article), with social media hosts insisting on a "unitary identity" and killing the remix vibe by continually tinkering with their platforms -- as was seen in the recent artist embrace of, and rapid disillusionment with, Google Plus. Also, at the time of Nasty Nets, et al, artists didn't have to care about whether all their efforts were making David Karp a very rich man and leaving them in the cyber-slums (i.e., mom's basement). Many don't care now -- but they should, maybe.

3. David Szafranski has some articles up that I wrote about his work -- before I moved back to NY in '95, so we were still in the print era. This one from the Dallas Morning News (1990) came at the tail-end of the "NEA flap," also known as the Culture Wars. This was sort of a proto-Boris Groys argument for the need for institutional empowerment -- so we find out what we actually care about. At the time a friend asked me, "what are you actually saying here?" I think it was an art review disguised as a contrarian political argument.

Update: Mikhel Proulx's anti-disambiguation of his name by linking to "Mikhel Proulx" on Google Images (for his byline on the article "(Anti)Disambiguation") was so successful that I couldn't tell if he was a real person. He is, and also wrote this paper further developing themes in the article, including screenshots of a photoshop filter-ish riff Charles Westerman and I did on a stock photo company's aggressively watermarked image of a woman looking at a late Picabia painting at the Tate (on Nasty Nets).