Associated Press jumps the shark

Everybody has been saying that the Internet is killing journalism, but I see it as more of a suicide.

“The argument for lowercasing Internet is that it has become wholly generic, like electricity and the telephone. It never was trademarked and is not based on any proper noun,” Tom Kent, AP Standards Editor, said in a statement. “The best reason for capitalizing it in the past may have been that the term was new. At one point, we understand, ‘Phonograph’ was capitalized.”

Mr. Kent has become an arbiter of journalistic composition despite apparently being unfamiliar with the concept called “research.” Which explains the death of journalism better than the Internet, or indeed any number of internets.

Don’t be a .local yokel

Wikipedia has a nice technical write-up that explains why you should never, ever use the .local suffix the way Microsoft has frequently recommended.

But I like this politically incorrect version better:

Microsoft: “Gee, nobody is using the .local piece of the globally shared Internet namespace, so let’s tell all our customers that it’s best practice to use it for our totally super cool version of Kerberized LDAP service called Active Directory!”

Novell: “Oh noes, Microsoft has made an inferior competitor to our flagship technology! It’ll probably destroy our market advantage just like their inferior networking stack did!”

Linux/Unix: “Oh noes, when somebody attaches the new Microsoft technology to an existing mature standards-based network, Kerberos breaks!”

Microsoft: “HA HA HA HA HA HA HA we are totally following the standard, lusers!”

Linux/Unix: “grumble whine we will patch Kerberos even though we don’t agree.”

Microsoft: “whatevs. Did you notice we broke your DNS too? :)”

Apple: “Hey, IETF, we have this cool new zeroconf technology. We want to reserve the .local namespace for it.”

IETF: “OK, sure, you’ve filled out all the forms and attended all the meetings and there’s two independent implementations so you’ve done everything correctly. We have no valid reason to deny this allocation.”

Novell: “Hey, we were using SLP already, what did you just do?”

Apple: “Oh, whoopsie, did we just eat your lunch? HA HA HA HA HA”

Microsoft: “Hey, what just happened?”

Apple: “HA HA HA HA HA HA HA HA HA HA HA RFC6762, lusers!”

Linux/Unix: “grumble mumble whatevs. We can do mDNS.”

Microsoft customers: “OH NOES WE ARE SCREWZ0RRED”

Microsoft: “Meh, you didn’t really want Apple products on your networks anyway.”

:TEN YEARS LATER:

Microsoft customers: “How much would it cost to fix this network?”

Microsoft: “What, were you talking to us? Everything’s fine here. Windows 10 forever!”
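
For the technically curious, the collision is easy to see from a client. Here's a hedged sketch (host and domain names below are invented; avahi-utils on linux, dns-sd on a Mac):

avahi-resolve --name fileserver.local      # RFC 6762: ask the local link via multicast, not your DNS servers
dns-sd -G v4 fileserver.local              # the same query on a Mac
nslookup fileserver.ad.example.com         # a non-.local AD domain gets a normal unicast DNS lookup
# A client that treats .local as mDNS-only may never send your corp.local queries to the AD DNS servers at all.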

Firefox annoyance #5: redirect caching

Firefox Annoyances:

1) Sync
2) Pocket
3) Hello
4) everything else, other than the plug-in API itself, that isn’t a paper-thin shell around Gecko
5) 301 redirect caching

To clear the 301 redirect cache for a single page, go to the “View” menu and light up the “History” sidebar (yeah, of course you forgot about that, nobody uses it), find the site you’re working on, right-click it, and select “Forget About This Site”.
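
If you want to see what the server is actually sending today, as opposed to what Firefox squirreled away months ago, you can check from outside the browser cache entirely (the URL here is a placeholder):

curl -sI https://example.com/old-page | grep -iE '^HTTP|^location|^cache-control'
# a 301 with no Cache-Control or Expires header is cacheable essentially forever,
# which is why Firefox keeps replaying the old redirect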

annoying git

I’ve been installing git on some corporate servers with the idea of converting existing CVS and ad-hoc code management systems into something reasonably fast and modern.

It’s been somewhat tedious and painful, but supposedly once I’m done the installation will be stable and maintainable. For an enterprise SCM that’s a lot more important than ease of installation, at least in theory. (I ran OpenLDAP for a decade or more, so I can appreciate the value of putting all the pain up front.)

Today’s annoyance is that the gitolite documentation and web site refer to a “hosting user” but the toolset and other web sites describing gitolite installation talk about an “admin user”. After wasting several hours with Google trying to find out exactly what the difference was, I created a new user account for the admin user and executed the commands – at which point it became immediately obvious that THOSE ARE THE SAME DAMN THING.

Curse you, gitolite. I WANTED US TO BE FRIENDS.
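
For anyone following along at home, here's the sketch I wish I'd had, with assumed names (the hosting account is git, the admin key file is admin.pub, and gitolite itself is already installed):

# the "hosting user" is the unix account that owns all the repositories
su - git
# the "admin user" is just whoever owns the public key you hand to setup;
# that key gets control of the gitolite-admin repository
gitolite setup -pk admin.pub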

The User IS the Enemy

Excellent, often tongue-in-cheek PDF by Stuart Schechter about the unique problem of designing computers that have to function around children.

The User IS the Enemy, and (S)he Keeps Reaching for that Bright Shiny Power Button!

Traceroute vs Tracert

Van Jacobson’s traceroute utility is not the same thing as Windows tracert, and the MS-Windows tool is probably more academically correct. The traceroute included with most linux and BSD operating systems can do both kinds of tracing, but does the Van Jake by default (use traceroute -I to get the Windows-style ICMP trace).
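
In concrete terms (the target host is just an example; the ICMP variants generally want root):

traceroute example.com       # classic Van Jacobson style: UDP probes to high-numbered ports
traceroute -I example.com    # ICMP echo probes instead
tracert example.com          # the Windows tool, ICMP echo by default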

People have occasionally given routers silly names to produce amusing traces.

I can have a page named NUL in linux, though.

Excellent article, but he forgot my favorite, CLOCK$. I used to have a web page with a big, shiny red button linked to <A HREF="c:\clock$\clock$"> and the message “don’t click the button or your computer will be destroyed and all your files deleted”. It didn’t really do that, but it would instantly crash any Microsoft system prior to Win98SE or thereabouts. People did click on it, which still kind of amazes me.

Email a tree

The city of Melbourne gave their trees email addresses. People have been writing them love letters.

RMS is online

Richard Stallman finally figured out a way to get online that was ideologically acceptable.

…he now connects to websites from his own computer – via Tor and using a free software browser. Previously, he used a complicated workaround to more or less email webpages to himself. The announcement brought a surprised gasp and a round of applause from the 300-plus attendees.

“At one point, I used to believe that the Firefox trademark license was incompatible with free software, I found out I was mistaken – it does allow the redistribution of unmodified copies,” he said.

Terminology: routes and gateways

Originally, back when the ARPAnet merged with SRI, BBN, NSFnet and MERIT to become the Internet, and dinosaurs still roamed the earth, there was no such thing as a “network router”. How can that be? Meh, it’s just semantics. The terminology has evolved.

Internet-connected systems that routed traffic (which was most of them, back in the day) usually ran a program called “gated” (that’s the GATEway Daemon, written at MERIT) that routed IP traffic between networks. A lot of those oldtimey networks were connected by UUCP dial-up links that were only live between 11pm and midnight to save money, so the code was written to support poor quality network links that came and went somewhat randomly.

Any physical network connection that would accept packets bound for some remote network was called a gateway. Gateways were defined by their network addresses. A data structure was created to hold information about which gateways led to which networks – this is called the routing table. The individual entries in that table are created by specifying a set of target IP addresses (using a network address and a mask), a target gateway, and which physical connection to use to reach that target gateway. That terminology is still in use in some commands, such as the “route” command. The individual routing table entries quickly came to be called routes.
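
On a modern linux box those table entries look something like this (addresses, gateway, and interface below are invented; the add needs root):

route -n                                               # the old-school view of the routing table
ip route show                                          # the modern equivalent
ip route add 192.0.2.0/24 via 198.51.100.1 dev eth0    # target network/mask, gateway, physical connection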

At some point somebody at Stanford or MIT came up with the concept of the default gateway. This was a hack that has become a crucially important networking concept today. No matter what kind of OS they were running, network-connected computers already had routing tables that held networks, masks, and gateways – so a special “fake network” was defined for the purpose of putting a default gateway into the existing tables. It has an address/mask pair that makes no sense at all – 0.0.0.0/0.0.0.0 – this is intentional, so the fake network entry can’t possibly interfere with any real networks.

The network stacks of all modern systems (post 1979) will look for a route to a target address, and if they don’t find one, they will use the route defined by the 0.0.0.0/0.0.0.0 routing table entry. It’s a wild swing, the Hail Mary pass: you just throw it out there and hope for the best.
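
In practice the Hail Mary is installed like any other route (the gateway address is invented):

ip route add default via 198.51.100.1    # "default" is just shorthand for 0.0.0.0/0
route add default gw 198.51.100.1        # the same thing in the old syntax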

Since the default route fits the format that is used for all other routes (it just has an impossible ip/netmask pair) it can be carried by any dynamic routing protocol – BGP, EIGRP, OSPF, RIPv2, you name it. This usually causes more problems than it solves, so most places do not distribute default routes dynamically. Instead they are configured by DHCP or defined manually, and cannot fluctuate.

Anyway, today, individual people have their own computers, instead of sharing a computer with 500 other people using dumb terminals, so most of our hosts don’t route and their routing tables are almost empty. They will typically have two entries (sketched below):

1) the default route, still called the default gateway in many implementations
2) the route to the local net, which is specified by the host’s IP address and mask, and uses the physical ethernet port as the gateway.
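
On a typical linux desktop that works out to something like this (addresses invented):

$ ip route show
default via 192.168.1.1 dev eth0
192.168.1.0/24 dev eth0 proto kernel scope link src 192.168.1.23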

A host that has no default route can only talk to machines on networks for which it holds specific routes.

Multicast-capable hosts (like linux and Windows machines) may also have multicast routes in their routing tables, but that is something you usually only see on servers at this point. It will become more common on end user desktops in the future, though; MacOSX and Ubuntu already have multicast capabilities turned on from the factory.

So today any network-capable widget might have static routes, defined by the system administrators, and those static routes might include a default route. It might also have dynamic routes, learned by communicating over the network with other systems, and those dynamic routes might include a default route. You can still call the target of the default route the default gateway if you wish, or you can call it the default route’s next hop, but most networking pros will just say default route or default gateway interchangeably. We’re a little sloppy with the language.

Oddly, over time computers have become less and less capable of dealing with multiple default routes. The pre-v2 linux kernels handled it effortlessly, but modern linux is just as bad in this respect as Windows.

Language evolves, although not always for the better. I personally have found it advantageous to adopt or at least be fluent in the terms and notations used by the youngest generation of technologists. I try to say folder instead of directory, for instance, because directory now means a backend database accessed by LDAP, instead of an on-disk filesystem data structure. I insist on using only international date notation. And I would like to train myself to pronounce router the same as rooter – which is almost certainly going to be the standard pronunciation before I manage to retire – but I haven’t got that programmed into my wetware yet. And I try to always say route instead of gateway whenever possible. The only time I want to use the word gateway is when I’m specifically talking about the target of a route. It’s not that the term is wrong in all other contexts, it’s just that it’s somewhat sloppy and very old-fashioned; it’s like calling your car a flivver instead of a beater.

inherent vs inherited difficulty


There's always something better just out of sight

“There is nothing so easy but that it becomes difficult when you do it reluctantly.” — Publius Terentius Afer

Jimmy Wales was born with a tail.

Although the Wikipedia founder’s extra appendage was removed shortly before his first birthday, it was reputedly prehensile and covered with short silky fur. I know this is true because I read it on Wikipedia!

Concerning Nature’s “open access”

Earlier this week the Intartubes were boiling with the news that Nature Magazine would open its archives back to 1869. Which would, indeed, be marvelous and unexpected.

But it’s a little more complicated than that… it seems Nature’s publisher, Macmillan, is going to let paid Nature subscribers use (yet another) foredoomed-to-failure “read only sharable format”.

The content-sharing policy, which also applies to 48 other journals in Macmillan’s Nature Publishing Group (NPG) division, including Nature Genetics, Nature Medicine and Nature Physics, marks an attempt to let scientists freely read and share articles while preserving NPG’s primary source of income — the subscription fees libraries and individuals pay to gain access to articles.

That sounds pretty great for everybody, right? Win-win!

ReadCube, a software platform similar to Apple’s iTunes, will be used to host and display read-only versions of the articles’ PDFs. If the initiative becomes popular, it may also boost the prospects of the ReadCube platform, in which Macmillan has a majority investment.

Starting to sound a lot dodgier now… we may have a reality disconnect going on…

Although the screen-view PDF cannot be printed, it can be annotated — which the publisher says will provide a way for scientists to collaborate by sharing their comments on manuscripts.

Yep, reality check sorely needed. Hey, look, smartphones have cameras!

ReadCube -> monitor screen -> camera phone -> email -> PC -> printer.

There are no formats that can be viewed but not printed. If you think such a thing exists, everything you’ve built is suspect, because you’re apparently not entirely aware of what’s going on around you. The odds are good that Macmillan’s “read only format” can be trivially defeated, and that script kiddy hacks will be available in short order.

Any questions?

Accurately named network distortion tool

“Comcast is a tool designed to simulate common network problems like latency, bandwidth restrictions, and dropped/reordered/corrupted packets.”

https://github.com/tylertreat/Comcast
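
Tools like this are mostly friendly wrappers around the kernel's own traffic-control machinery; here's a bare-hands sketch of the same idea with tc and netem (interface name and numbers are invented, needs root):

tc qdisc add dev eth0 root netem delay 250ms loss 10%   # add latency and packet loss on eth0
tc qdisc del dev eth0 root                               # put things back the way they were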

Office not so 365

Microsoft’s Azure Cloud service failed at almost exactly midnight last night, taking down hundreds of websites whose owners may have thought that hardware redundancy could magically protect them from sysadmin oopses, as well as users of Xbox Live and Microsoft’s flagship service Office 365.

Viva Zorggroep, a Dutch healthcare organisation with 4,000 employees, said it had also been affected as a consequence of adopting Microsoft’s online apps.

“At this time, our supporting departments such as finance, HR, education, IT et cetera are working with Office 365,” said Dave Thijssen, an IT manager at the company.

“This morning these servers were unresponsive, which means users were not able to log in to Office 365.

“As a result they had no access to email, calendars, or – most importantly – their documents and Office Online applications.

“We also had trouble reporting the outage to our users as most of digital communication – email, Lync, intranet/Sharepoint – was out.”

The outage persisted for over five hours for some customers and apparently there are still latency issues at this time. This is of course a violation of the Service Level Agreement… so you can keep a nickel or two of your monthly rent, I bet.

XKCD Philae coverage as a flipbook

See here for some nice work by Mark Karpeles.

Internet soft spots

Want to build a ginormous botnet without doing a lot of work? Compromise one of the Internet’s soft spots.

If you take over boingboing.net, you can use the site to inject malware into 1.3 million visitors. Chump change! How about TheChive.com, or Kottke.org, or whatever? Face it, you’re not going to get more than 15 million suckers. It’s just too much effort for a lazy man; you’d still be doing a lot of hard work to recruit a paltry few million zombies.

So, you take over jquery.com, or typekit.com. Now you’re cooking with gas! It’s become common practice for websites to use remotely sourced scripts – so there are thousands of sites that will blindly push out whatever is in the file jquery.js at jquery.com, and all that site’s visitors will run it just as blindly. So if you take over a popular script or advertisement source, you can leverage that into billions of individual attacks, quite easily.

And that’s my Halloween horror story for this year.

Putting it on a computer doesn’t make it new.

This Ars Technica article is notable not only because it explains the Alice decision, but because it leads with a picture of a Wang System 2200 terminal. I taught myself BASIC on one of these around 1977 or so (before the Black Ships came and the secret of hose gartering that doesn’t ravel was lost).

Phone Scammer Slammer

Revenge!

“Man Ass”

Unix-derived operating systems have a tradition of making commands short and easily typed regardless of social conventions.

So, in order to consult the manual page for the Autonomous System Scanner, you would type “man ass” at the command line. People involved with AS work would not find this remarkably odd or offensive – we’ve already got jobs to do that don’t involve complaining about other people’s sense of propriety.

However, if one creates a site that automatically generates HTML-formatted web pages from the man pages of the Ubuntu 13.04 linux distribution, popularly called Raring Ringtail, one ends up hosting a page describing “raring man ass”.

The Internet being what it is, such a page may have unexpected effects on your google analytics results…

Redstone Rockets

I enjoy reading John Bullard’s History of the Redstone Missile System, although most people are likely to find it pretty dry. I found it linked from Jim Ryan’s marvelously informative site, which is a memoir of his experiences manning the Army’s Redstone missiles from 1958 to 1962. It’s a wonderful site to visit if you’re a hardcore rocket buff or cold war historian, although perhaps not much fun for those who couldn’t keep themselves awake in history class.

I think sites like Jim’s are the best thing about the World Wide Web. Computer professionals didn’t need the WWWeb to communicate with each other and organizations didn’t need the Web to move data – those needs were already met by the Internet itself, underlying the Web. But the Web lets people like Jim reach out to the whole world, not just computer gurus, with information that would never otherwise be available to many of the people most interested in it.