Monthly Archives: January 2014

Facebook Paper

Paper presents user updates as “stories”: captions overlaid on large-format photos, auto-playing videos, and even long or short text screeds, all in an edge-to-edge, full-screen format. The default “section” in the app is the user’s Facebook news feed, but users can pull new sections up from a set of cards, such as “Headlines” or “Tech,” and browse between them in one pane.

“Each section includes a rich mix of content from emerging voices and well-known publications,” Facebook says. This gives the biggest clue to the real intended creators for Paper: brands, be they news outlets or celebrities.
Ars Technica: Facebook’s Paper is Facebook without the Facebook

Oh, look, yet another proprietary digital publishing platform targeted at publishers. How quaint. Here, let me add it to the pile of fifty or so I have over here.

Applescript: Getting unique items in a list update

I love getting questions about the contents of, or topics related to, my site. Most recently, I was emailed a question about one of the older functions I have in the Applescript section. In particular, it was one for getting unique items in a list. Here’s the function…

on GetUniqueItems(sourceList)
	set itemCount to (get count of items in sourceList)
	set compiledList to {}
	--get the first item to kick off the list
	repeat with x from 1 to itemCount
		set itemFound to false
		set itemX to item x of sourceList
		if x < itemCount then
			repeat with y from (x + 1) to itemCount
				set itemY to item y of sourceList
				if itemY is itemX then set itemFound to true
			end repeat
		else
			repeat with y from 0 to (itemCount - 1)
				set itemY to item y of sourceList
				if itemY is itemX then set itemFound to true
			end repeat
		end if
		if itemFound is false then
			set end of compiledList to itemX
			exit repeat
		end if
	end repeat
	--if no items are found
	if (get count of items in compiledList) is 0 then
		return compiledList
	end if
	--find the rest of the unique items
	repeat with x from 1 to itemCount
		set itemFound to false
		set itemX to item x of sourceList
		set resultCount to (get count of items in compiledList)
		repeat with y from 1 to resultCount
			set itemY to item y of compiledList
			if itemY is itemX then set itemFound to true
		end repeat
		if itemFound is false then set end of compiledList to itemX
	end repeat
	return compiledList
end GetUniqueItems

The question focused on why I go through the source list more than once. As soon as I re-read the function with the question in mind, I knew they were right that something was off. My answer essentially explained that this was one of the first useful home-brewed functions I wrote, and since it worked, it stuck, as working code is wont to do. But, honestly, I’ve reviewed this code a dozen times and it has me completely baffled as to how it works. I think there is even a whole block in there that could come out without changing anything.

I started writing my first Applescripts in 2005, which was also my first serious foray into programming. The last time I had thought about if...then statements before that was writing BASIC for the Commodore 64 in high school. This function, according to my notes, was written in 2007, when my needs and skills were becoming more robust, and it is still in use in several scripts today with nary an error. But, with nine years of experience behind me now, the function is absolutely cringe-worthy (though only to a point, considering when I wrote it), so I rewrote it. Et voilà…

on getUniqueItems(src)
	set srcCount to (count src)
	set unq to {}
	
	repeat with x from 1 to srcCount
		set srcItem to item x of src
		set unqCount to (count unq)
		set match to false
		
		repeat with y from 1 to unqCount
			set unqItem to item y of unq
			if srcItem = unqItem then
				set match to true
			end if
		end repeat
		
		if match is false then
			set end of unq to srcItem
		end if
	end repeat
	
	return unq
end getUniqueItems
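
For instance, a quick sanity check with some throwaway sample data:

getUniqueItems({"a", "b", "a", "c", "b"})
--> {"a", "b", "c"}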

Hindsight being 20/20 and all that, this is a “duh!” moment. There are a couple of important things to note about this.

First, my test data for these types of functions is reliable but small. This is O(n²), which is on the low end of things, but Applescripts very rarely deal with data sets large enough for an O(nᵏ) algorithm to leave time to get a coffee and a sandwich while waiting. My personal experience and preference is that if that were the case, then I would need to go find a more appropriate tool for data prep.

Second (and last), this block…

set unqCount to (count unq)
set match to false

repeat with y from 1 to unqCount
	set unqItem to item y of unq
	if srcItem = unqItem then
		set match to true
	end if
end repeat

if match is false then
	set end of unq to srcItem
end if

…could be replaced with this common Applescript hook…

if unq does not contain srcItem then
	set end of unq to srcItem
end if
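
Folding that in, the entire handler would collapse to something like this (an untested sketch of the same logic):

on getUniqueItems(src)
	set unq to {}
	repeat with srcItem in src
		-- "contents of" dereferences the loop variable for comparison
		if unq does not contain (contents of srcItem) then
			set end of unq to (contents of srcItem)
		end if
	end repeat
	return unq
end getUniqueItems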

The only problem with this, as I see it, comes when comparing custom data types as opposed to core data types. contains is great if I only ever work with Applescript’s core data types, like string, number, date, and the like. But almost all of my Applescript code has been targeted at Adobe’s Creative Cloud, which brings a wealth of custom objects with loads of properties with which to work. I think leaving in the extra code (and accepting any possible hits on speed, since it is not baked into the language like contains) is reasonable for the sake of easy customization later. By way of example, this…

-- compare memory addresses
if srcItem = unqItem then

…becomes this in a pinch…

-- compare object properties
if foo of srcItem = foo of unqItem then

…or even…

-- deep comparison
if my customCompare(foo of srcItem, foo of unqItem) then
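
To be clear, customCompare is purely hypothetical, but a skeleton for one might look something like this:

on customCompare(a, b)
	-- hypothetical deep comparison: flesh this out with whatever
	-- properties matter for the object type in play
	if class of a is not class of b then return false
	return a is b
end customCompare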

So, a bit of extra code for the win. I suppose I could set up a hash table implementation to improve upon the O(n²) to O(nᵏ) range of complexity, but with Applescript work, again, it’s really not worth it.
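
If it ever were worth it, the path of least resistance would probably be bridging to Cocoa rather than hand-rolling a hash table. Here is a sketch, assuming a system recent enough to run AppleScriptObjC in a plain script and a list containing only core, bridgeable data types:

use framework "Foundation"

on getUniqueItemsFast(src)
	-- NSOrderedSet hashes its members, de-duplicating in roughly
	-- linear time while preserving the original item order
	set srcArray to current application's NSArray's arrayWithArray:src
	set ordered to current application's NSOrderedSet's orderedSetWithArray:srcArray
	return (ordered's array()) as list
end getUniqueItemsFast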

That was a really great question on a number of levels. Not only does it show that people actually read the site on occasion and find something useful, which is the core goal of the site (this blog is really more of a place to vent that offers me more flexibility than other blogging sites or social media), but it also compelled me to review and improve old code and see just how far I have advanced over the years. Win-win.

The Macintosh is 30.

The Macintosh is 30. Like so many other people, I am asking the same question: how the hell did that happen?

I remember the first time I used a Macintosh. I was a junior in high school in 1988 taking a basic drafting and architecture class. In the corner of the classroom was a small bank of Macs all loaded with illustration and CAD programs. Students who achieved a high enough grade got to use those computers for their projects. I was getting an “A” (one of my precious few at the time) and so I scored some much-ballyhooed computer time.

When I sat down in front of the Mac, I had no idea what it was. I don’t think I had even heard of it until I walked into that class. This will likely sound cliché, but the whole experience was intuitive right from the start, from creating a new file, to drawing the lines and shapes with the mouse to create the machine-part plans, to saving the file, and the rest. I was always good at visualizing things, and the Mac was the first computer I had ever used that spoke my internal language. I could use my hand to draw, and I was finally able to make a connection between the files I saw on-screen and the data on the disks. I loved using the Mac.

At home, I had a Commodore 64, and all I knew how to do with it was write really basic BASIC programs and run games, both bought and pirated. When the 8-bit GEOS operating system was released, my use of the Commodore really took off because all of a sudden it could be used just as easily as the Mac. All of those disparate applications were finally unified in a single interface. I had a joystick as opposed to a mouse—if a mouse existed to use with GEOS, I never knew about it—but that didn’t matter. Mucking about with files and navigating to the apps I needed to write papers and do homework was a lot easier. GEOS brought more value to the Commodore for me than anything I had done with it before, but it was never as streamlined an experience as using the Mac.

Around 1993, I finally got a Mac at home (thanks, Mom) and the Commodore went into a closet. Many years later, 2000 I think, I sold the still-unused Commodore and all of my software at the MIT Flea Market to some random college student for $50. I never looked back.

Part of the appeal of getting into graphic design for me was that the Mac was so prevalent. I knew that if I got a job doing graphic design that I could probably get my company to buy a Mac for me to use. Honestly, if it weren’t for the Mac, I’m not sure what I would have done for a living. As I proceed down the path to getting my computer science degree, using a Mac has yet to hold me back and I see a lot of Macs in the classroom, so I look forward to another 30 years of use.

Happy birthday, Macintosh.

Loss of net neutrality? Ain’t gonna happen.

Anyone with half a brain, including the telcos, should quickly see that mucking with net neutrality is bad for business.

The graphic posted to Reddit showing tiered web site access à la cable channels is the most (scarily) obvious example. I’ll link to the graphic itself for convenience (but pardon any broken links a few years from now):

Reddit-source speculative tiered access from Hell.

Let’s noodle around with the implications of that for a minute…

First, I have to wonder if this is a result, be it direct or otherwise, of privatization. The telephone system was basically built out after World War II under the Rural Electrification Act. The telephone system, even though managed by commercial organizations, was still, at its core, a public utility ultimately governed under FCC Title II. Broadband—which I define here as coax, fiber optic, ISDN, Verizon FiOS, Comcast, not phones—was not implemented by any such legislation, as near as I can tell. While the telcos may have gotten tax breaks and aid at the state and federal levels to help lay down cable, I don’t remember there being any legislation pushing broadband out to every part of the country for the Common Good. So, their network, their rules. Everyone needs to be taking notes on this if they aren’t already. But at what point does a resource become so ubiquitous that it ought to fall under Title II? Take the Internet wholesale out of the economy and what would happen next? Would the impact on the economy be dramatic enough to establish net neutrality in legislation even outside of Title II? I think these are fair questions.

Second, I can’t imagine that any company doing business on the web, in whole or in part, would be pleased in any way with the telcos holding court over what gets through simply because a customer can, cannot, or is unwilling to pay an extra amount on the bill. Even websites that offer free services through advertising can’t be happy with this if there were suddenly even a 10—nay, 5%—drop in ad revenue simply because their site is successful enough to be in a top tier. The combined forces of Google, Microsoft, Apple, Facebook, and Amazon, not to mention every Chamber of Commerce in existence, would be too epic for Verizon, Comcast, Time Warner, Cox, and others of their ilk to afford to fight.

Third, I don’t see anything like the tiered pricing being manageable in any way. The domain space is huge: 112 million registrations in .com alone. To throttle a list of even 1% of that is 1.12 million domains. There is not a workforce on this planet that can take on that task. The Reddit graphic lists fewer than 60. Granted, they are all heavy hitters, but would the telcos then hire a sales staff to partner with websites? How would such a deal be pitched to websites in a way that is even remotely appealing? This has the same funky smell as those deals the NFL makes with cities, where the city has to buy up any unsold tickets in the stadium the city built (and not even to avoid a broadcast blackout of the home game). Who would agree to such nonsense?

I can’t believe we’re even having this conversation, but I suppose we need to have it to reason out what is right and what is just pragmatically stupid. There are already some interesting responses that have come to light, notably the tools noted below, which report on streamed video quality by ISP:

These are a great way to call attention to the issue of net neutrality in a way that has direct impact to the user. And away we go…

UPDATE: Just to be clear, none of this means that I think we shouldn’t bother fighting for net neutrality. I feel quite the contrary, actually: we should fight for it precisely so we never experience anything like tiered web access beyond speed tiers. The hit to the economy would just be unreasonable. What I am trying to express here is that I don’t necessarily agree with the doomsayers that the Internet’s utility will be diminished to almost nothing. The telcos are already throttling certain traffic, as the websites noted above imply, and the recent ruling does compel the telcos to reveal what kind of traffic throttling they will be utilizing. Net neutrality is important to everyone, but we are nowhere near “all is lost.”

The Federalist: The Death of Expertise

The death of expertise is a rejection not only of knowledge, but of the ways in which we gain knowledge and learn about things. Fundamentally, it’s a rejection of science and rationality, which are the foundations of Western civilization itself. Yes, I said “Western civilization”: that paternalistic, racist, ethnocentric approach to knowledge that created the nuclear bomb, the Edsel, and New Coke, but which also keeps diabetics alive, lands mammoth airliners in the dark, and writes documents like the Charter of the United Nations.
The Federalist: The Death of Expertise

About a year ago, I took a class that explored a number of issues related to this very topic, and I look forward to taking a similar class next year.

“And in truth, I’ve never known a man worth his salt who, in the long run, deep down in his heart, didn’t appreciate the grind, the discipline. The difference between a successful person and others is not a lack of strength, not a lack of knowledge, but rather… a lack of will.”
Vince Lombardi

ars technica: How QuarkXPress became a mere afterthought in publishing

Quark’s demise is truly the stuff of legend. In fact, the story reads like the fall of any empire: failed battles, growing discontent among the overtaxed masses, hungry and energized foes, hubris, greed, and… uh, CMYK PDFs. What did QuarkXPress do—or fail to do—that saw its complete dominance of desktop publishing wither in less than a decade? In short, it didn’t listen.
ars technica: How QuarkXPress became a mere afterthought in publishing

Much of what happened to Quark and Microsoft is now happening to Adobe. I am increasingly seeing criticism of Adobe’s painfully high prices for questionable updates (primitive 3D objects in Photoshop? Why?). The difference this time, however, is that there is no alternative on the horizon. If I recall correctly, InDesign was rumored for quite a while before release. Even when InDesign might still have turned out to be vaporware, the enthusiasm for it was palpable, but Quark appeared to simply not give a shit what anyone had to say; Quark’s hubris was just astounding. Today, Adobe turns a deaf ear if only because it has no compelling reason to listen.

Ars Technica: Nintendo president hints at exploring smartphone gaming support

“We are thinking about a new business structure,” Iwata told the press, according to a Bloomberg News report. “Given the expansion of smart devices, we are naturally studying how smart devices can be used to grow the game-player business. It’s not as simple as enabling Mario to move on a smartphone.”

“We cannot continue a business without winning,” he continued. “We must take a skeptical approach [to] whether we can still simply make game players, offer them in the same way as in the past for 20,000 yen or 30,000 yen, and sell titles for a couple of thousand yen each.”
Ars Technica: Nintendo president hints at exploring smartphone gaming support

I’m willing to bet dollars to donuts that Nintendo has some skunkworks deep in the heart of headquarters where Mario, Zelda, and their colleagues are running freely on iOS and Android, if not also on desktops, while the company waits and figures out the best way to roll it all out. This would be just like the rumor I read ages ago that Apple had kept most incarnations of Mac OS running on Intel chips the entire time it was manufacturing PowerPC machines. It is not hard to see the benefit of doing so.

If Zelda came to iOS, I would snap it up in a second. My Wii has barely been touched since I started school, three new consoles have shipped in the meantime, and I still have a long way to go. I really hope they are moving in this direction, though I can also understand the hesitancy of handing over 30% of revenue to Apple.