All posts by Philip Regan

Bloombergview.com: Missing E-Mail Is the Least of the IRS’s Problems

Such policies indicate either an agency that is not concerned with preserving good audit chains or one that has an extremely penny-wise, pound-foolish approach to IT policy. At prevailing wages—and hard drive prices—it is a waste of money to force even your lowest-level employee to spend time painstakingly deleting or archiving e-mails. If IRS staffers don’t have anything better to do with their time, then the IRS needs fewer staffers, not stricter mailbox policies.

In the case of a government agency, however, it’s especially troubling. Records pertaining to agency decisions are supposed to be systematically archived forever. I’m not saying that the IRS’s e-mail retention policy is uniquely bad in the federal government, only that whatever the current practice is, the IRS did not preserve nearly as much as one would like in a representative, transparent democracy.
Bloombergview.com: Missing E-Mail Is the Least of the IRS’s Problems

Apple hijacks Unix headers into Xcode in Mavericks

I am currently taking a class on Unix systems programming. While following along in lecture, the professor said that the header files needed for our kind of work are almost always located in /usr/include. However, that directory does not exist on my brand new Mavericks MacBook Pro. I have learned (after much searching of the web) that the header files are now here:

/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.9.sdk/usr

After doing some more research, I found that to get those Unix header files back into /usr/include, one has to install Xcode’s command line tools. But, in order to get those, one has to be a registered member on Apple’s developer website. Not necessarily a paying member, but registered.

I was once a paying member of Apple’s developer program, but I gave it up because I was not actually using everything it made available. My focus changed, and the annual $99 was going to waste. Because of that, and because Apple makes so much of its Cocoa documentation freely available, my need to log into their website dwindled to the point that I had not touched it in over a year. Now I am jumping through hoops to figure out which account I was using and what its password was, but that is turning out to be harder than expected for a variety of reasons on Apple’s side, the servers not propagating my Apple ID resets to the developer site being one of them. Yes, I should have done a better job of recording my information, but a simple password reset shouldn’t be this hard either. At this point, I will likely just create a new account solely for getting what I need.

I have no idea when Apple hijacked the header files, and I can understand the logic and convenience of doing so for tool updates, but knowing what little I do about Unix, hijacking them seems to be anathema to Unix culture. Apple has made my morning nothing but hassle as I try to get this fixed, and I have an assignment due soon. So, cheers for that, Apple.

UPDATE: I wound up creating an account solely for developer access, and it was, as expected, faster and easier than mucking about with a bunch of password resets. Once I did that, getting the command line tools was not straightforward, though not hard either (Xcode > Open Developer Tool > More Developer Tools…, which then kicks you over to a downloads page on the developer website, with registration required for access).

As for my comment about the new default header location being anathema to Unix, I realize now that’s not necessarily true. If anything, Apple can do whatever they want with their distro. But the fact remains, based on everything I have read so far, that there are some clear expectations about where things ought to be, and the header file location is one of them. At least now I can establish a workflow where I use the muscle of Xcode to develop and then confidently test outside of it before submission.
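
As a quick sanity check that everything landed where it should, here is a sketch runnable from Script Editor (assuming xcrun, which ships with Xcode 5, is available):

--report whether the traditional header directory is back,
--and where the active toolchain thinks the SDK lives
set headerStatus to "missing"
try
	do shell script "test -d /usr/include"
	set headerStatus to "present"
end try
set sdkPath to do shell script "xcrun --show-sdk-path"
display dialog "/usr/include is " & headerStatus & return & "SDK path: " & sdkPath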

Facebook Paper

Paper presents user updates as “stories”: captions overlaid on large-format photos, auto-playing videos, and even long or short text screeds, all in an edge-to-edge, full-screen format. The default “section” in the app is the user’s Facebook news feed, but users can pull new sections up from a set of cards, such as “Headlines” or “Tech,” and browse between them in one pane.

“Each section includes a rich mix of content from emerging voices and well-known publications,” Facebook says. This gives the biggest clue to the real intended creators for Paper: brands, be they news outlets or celebrities.
Ars Technica: Facebook’s Paper is Facebook without the Facebook

Oh, look, yet another proprietary digital publishing platform targeted at publishers. How quaint. Here, let me add this to the pile of fifty or so I have over here.

AppleScript: Getting unique items in a list, updated

I love getting questions about the contents of, or topics related to, my site. Most recently, I was emailed a question about one of the older functions in the AppleScript section. In particular, it was the one for getting unique items in a list. Here’s the function…

on GetUniqueItems(sourceList)
	set itemCount to (get count of items in sourceList)
	set compiledList to {}
	--get the first item to kick off the list
	repeat with x from 1 to itemCount
		set itemFound to false
		set itemX to item x of sourceList
		if x < itemCount then
			repeat with y from (x + 1) to itemCount
				set itemY to item y of sourceList
				if itemY is itemX then set itemFound to true
			end repeat
		else
			repeat with y from 0 to (itemCount - 1)
				set itemY to item y of sourceList
				if itemY is itemX then set itemFound to true
			end repeat
		end if
		if itemFound is false then
			set end of compiledList to itemX
			exit repeat
		end if
	end repeat
	--if no items are found
	if (get count of items in compiledList) is 0 then
		return compiledList
	end if
	--find the rest of the unique items
	repeat with x from 1 to itemCount
		set itemFound to false
		set itemX to item x of sourceList
		set resultCount to (get count of items in compiledList)
		repeat with y from 1 to resultCount
			set itemY to item y of compiledList
			if itemY is itemX then set itemFound to true
		end repeat
		if itemFound is false then set end of compiledList to itemX
	end repeat
	return compiledList
end GetUniqueItems

The question focused on why I go through the source list more than once. As soon as I looked at the function again with that question in mind, I knew they were right that something was wrong. My answer essentially explained that this was one of the first useful home-brewed functions I wrote, and since it worked, it stuck, as working code is wont to do. But, honestly, I’ve reviewed this code a dozen times and it has me completely baffled as to how it works. I think there is even a whole block in there that could come out and nothing would change.

I started writing my first AppleScripts in 2005, which was also my first serious foray into programming. The last time I had thought about if...then statements before that was in high school, writing BASIC for the Commodore 64. This function, according to my notes, was written in 2007, when my needs and skills were becoming more robust. It is in use in several scripts today with nary an error. But nine years of experience later, the function is immediately, absolutely cringe-worthy (though only to a point, considering when I wrote it), so I rewrote it. Et voilà…

on getUniqueItems(src)
	set srcCount to (count src)
	set unq to {}
	--walk the source list, keeping the first occurrence of each item
	repeat with x from 1 to srcCount
		set srcItem to item x of src

		set unqCount to (count unq)
		set match to false

		repeat with y from 1 to unqCount
			set unqItem to item y of unq
			if srcItem = unqItem then
				set match to true
			end if
		end repeat

		if match is false then
			set end of unq to srcItem
		end if
	end repeat
	return unq
end getUniqueItems
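
A quick sanity check, with a toy list of my own:

set testList to {"a", "b", "a", "c", "b"}
getUniqueItems(testList)
--> {"a", "b", "c"}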

Hindsight being 20/20 and all that, this is a “duh!” moment. There are a couple of important things to note about it.

First, my test data for these types of functions is reliable but small. This is O(n^2), on the low end of things, but AppleScripts very rarely deal with data sets large enough for an O(n^k) runtime to leave time to get a coffee and a sandwich while waiting. My personal experience and preference is that if that were ever the case, then I need to go find a more appropriate tool for the data prep.

Second (and last), this block…

set unqCount to (count unq)
set match to false

repeat with y from 1 to unqCount
	set unqItem to item y of unq
	if srcItem = unqItem then
		set match to true
	end if
end repeat

if match is false then
	set end of unq to srcItem
end if

…could be replaced with this common AppleScript idiom…

if unq does not contain srcItem then
	set end of unq to srcItem
end if
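
Taken to its logical end, the whole handler collapses around contains. A minimal sketch (the handler name is mine), fine so long as the list holds core data types:

on getUniqueItemsViaContains(src)
	set unq to {}
	repeat with anItem in src
		--the loop variable is a reference, so dereference it first
		set theValue to contents of anItem
		if unq does not contain theValue then
			set end of unq to theValue
		end if
	end repeat
	return unq
end getUniqueItemsViaContains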

The only problem with this, as I see it, is when comparing custom data types as opposed to core data types. The contains check is great if I only ever work with AppleScript’s core data types, like string, number, date, and the like. But almost all of my AppleScript code targets Adobe’s Creative Cloud, which brings a wealth of custom objects with loads of properties to work with. I think leaving in the extra code (and taking any possible hit on speed, since it is not baked into the language the way contains is) is reasonable for the sake of easy customization later. By way of example, this…

-- compare whole values directly
if srcItem = unqItem then

…becomes this in a pinch…

-- compare object properties
if foo of srcItem = foo of unqItem then

…or even…

-- deep comparison
if my customCompare(foo of srcItem, foo of unqItem) then
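
That customCompare handler is purely a placeholder, but a sketch of one might look like this, treating two values as equal when their text renderings match:

on customCompare(a, b)
	--hypothetical deep comparison: equal when both values
	--render to the same text
	return (a as text) is (b as text)
end customCompare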

So, a bit of extra code for the win. I suppose I could set up a hash table implementation to improve upon the O(n^2) complexity, but with AppleScript work, again, it’s really not worth it.
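
For the record, if it ever were worth it, AppleScriptObjC on Mavericks or later gets there without a home-brewed hash table. A sketch, assuming the list holds bridgeable core types like strings and numbers, and noting that Cocoa’s idea of equality is not identical to AppleScript’s:

use framework "Foundation"

on getUniqueItemsFast(src)
	--NSOrderedSet hashes its members, so duplicates drop out in
	--roughly linear time while the original order is preserved
	set orderedSet to current application's NSOrderedSet's orderedSetWithArray:src
	return (orderedSet's array()) as list
end getUniqueItemsFast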

That was a really great question on a number of levels. Not only does it show that people actually read the site on occasion and find something useful, which is the core goal of the site (this blog is really more of a place to vent that offers me more flexibility than other blogging sites or social media), but it also compelled me to review and improve old code and see just how far I have advanced over the years. Win-win.

The Macintosh is 30.

The Macintosh is 30. Like so many other people, I am asking the same question: How the hell did that happen?

I remember the first time I used a Macintosh. I was a junior in high school in 1988 taking a basic drafting and architecture class. In the corner of the classroom was a small bank of Macs all loaded with illustration and CAD programs. Students who achieved a high enough grade got to use those computers for their projects. I was getting an “A” (one of my precious few at the time) and so I scored some much-ballyhooed computer time.

When I sat down in front of the Mac, I had no idea what it was. I don’t think I had even heard of it until I walked into that class. This will likely sound cliché, but the whole experience was intuitive right from the start, from creating a new file, to drawing the lines and shapes for the machine-part plans with the mouse, to saving the file and the rest. I was always good at visualizing things, and the Mac was the first computer I had ever used that spoke my internal language. I could use my hand to draw, and I was finally able to make a connection between the files I saw on-screen and the data on the disks. I loved using the Mac.

At home, I had a Commodore 64 and all I knew how to do with it was really basic BASIC programming and run games, both bought and pirated. When the 8-bit GEOS operating system was released, my use of the Commodore really took off because all of a sudden it could be used just as easily as the Mac. All of these disparate applications were finally unified in a single interface. I had a joystick as opposed to a mouse—if a mouse existed to use with GEOS I never knew about it—but that didn’t matter. Mucking about with files and navigating to the apps I needed to write papers and do homework was a lot easier. GEOS really brought value to the Commodore for me, more than anything I had ever done before, but it was never as streamlined an experience as using the Mac.

Around 1993, I finally got a Mac at home (thanks, Mom) and the Commodore went into a closet. Many years later, 2000 I think, I sold the still-unused Commodore and all of my software at the MIT Flea Market to some random college student for $50. I never looked back.

Part of the appeal of getting into graphic design for me was that the Mac was so prevalent. I knew that if I got a job doing graphic design that I could probably get my company to buy a Mac for me to use. Honestly, if it weren’t for the Mac, I’m not sure what I would have done for a living. As I proceed down the path to getting my computer science degree, using a Mac has yet to hold me back and I see a lot of Macs in the classroom, so I look forward to another 30 years of use.

Happy birthday, Macintosh.

Loss of net neutrality? Ain’t gonna happen.

Anyone with half a brain, including the telcos, should quickly see that mucking with net neutrality is bad for business.

The graphic posted to Reddit showing tiered web site access a la cable channels is the most (scarily) obvious example. I’ll link to the graphic itself for convenience (but pardon any broken links a few years from now):

Reddit-sourced speculative tiered access from Hell.

Let’s noodle around with the implications of that for a minute…

First, I have to wonder if this is a result, direct or otherwise, of privatization. The telephone system was largely built out after World War II under the Rural Electrification Act, and even though it was managed by commercial organizations, it remained at its core a public utility governed under FCC Title II. Broadband—which I define here as coax, fiber optic, ISDN, Verizon FiOS, Comcast, not phones—was not implemented by any such legislation, as near as I can tell. While the telcos may have gotten tax breaks and aid at the state and federal levels to help lay down cable, I don’t remember any legislation pushing broadband out to every part of the country for the common good. So: their network, their rules. Everyone needs to be taking notes on this if they aren’t already. But at what point does a resource become so ubiquitous that it ought to fall under Title II? Take the Internet wholesale out of the economy, and what would happen next? Would the impact be dramatic enough to establish net neutrality in legislation even outside of Title II? I think these are fair questions.

Second, I can’t imagine that any company doing business on the web, in whole or in part, would be pleased in any way with the telcos holding court over what gets through simply because a customer cannot (or will not) pay an extra amount on the bill. Even websites that offer free services through advertising can’t be happy with this if there were a sudden 10 percent, nay 5 percent, drop in ad revenue simply because their site is successful enough to land in a top tier. The combined forces of Google, Microsoft, Apple, Facebook, and Amazon would be too epic for Verizon, Comcast, Time Warner, Cox, and others of their ilk to even afford to fight, not to mention every Chamber of Commerce in existence.

Third, I don’t see anything like the tiered pricing being manageable in any way. The domain space is massive: 112 million domains in .com alone. Throttling a list of even 1% of that means 1.12 million domains. There is not a workforce on this planet that can take on that task. The Reddit graphic lists fewer than 60. Granted, they are all heavy hitters, but would the telcos then hire a sales staff to sign websites up as partners? How would such a deal be pitched to websites in a way that is even remotely appealing? This has the same funky smell as those deals the NFL makes with cities, where the city has to buy up any unsold tickets in the stadium the city built (and not even to avoid a broadcast blackout of the home game). Who would agree to such nonsense?

I can’t believe we’re even having this conversation, but I suppose we need to have it to reason out what is right and what is just pragmatically stupid. There are already some interesting responses that have come to light, notably the tools noted below, which report streaming video quality by ISP:

These are a great way to call attention to the issue of net neutrality in a way that has direct impact on the user. And away we go…

UPDATE: Just to be clear, none of this means I think we shouldn’t bother fighting for net neutrality. Quite the contrary: we should fight for it precisely so we never experience anything like tiered web access beyond speed tiers. The hit to the economy would just be unreasonable. What I am trying to express here is that I don’t necessarily agree with the doomsayers that the Internet’s utility will be diminished to almost nothing. The telcos are already throttling certain traffic, as the websites noted above imply, and the recent ruling does compel the telcos to reveal what kind of traffic throttling they will use. Net neutrality is important to everyone, but we are nowhere near “all is lost.”

The Federalist: The Death of Expertise

The death of expertise is a rejection not only of knowledge, but of the ways in which we gain knowledge and learn about things. Fundamentally, it’s a rejection of science and rationality, which are the foundations of Western civilization itself. Yes, I said “Western civilization”: that paternalistic, racist, ethnocentric approach to knowledge that created the nuclear bomb, the Edsel, and New Coke, but which also keeps diabetics alive, lands mammoth airliners in the dark, and writes documents like the Charter of the United Nations.
The Federalist: The Death of Expertise

About a year ago, I took a class that explored a number of issues related to this very topic, and I look forward to taking a similar class next year.