Twirl… THWAP went the towel, inspired by boys’ locker rooms. But there weren’t any showers or sweat, just the wide-gapped jaws of gleeful teenagers getting the Grandma treatment. She was at it again, my strong Grandma, probably saying “No junk!” or some such epithet conjured up that very moment; and me, marvelling at the spectacle. She passed this week, nearly ten years after one of her good days, our wedding day. She was an inspiration at interactions; people just wanted to talk to her everywhere she went. That the universe contains irony was demonstrated by her ten-plus-year battle with the horrific Alzheimer’s. Memories? Plenty. I was in high school when some guy, whose name escapes me but whose words remain implanted as an oral souvenir I file under “This is a day that I grew up,” remarked that “Grandma” was the most popular kid at school. Before that day, I knew her as just Grandma, and then I realized she had become everyone’s.
Posted by amorris on November 13, 2014
Okay, so since Friday afternoon I’ve been playing with the very, very interesting Google Apps Script (GAS), just because it was sitting there and I didn’t understand any of it, but I could definitely see its potential.
Then I went to a GAFE summit conference (I’m an ESL teacher by day) and saw what cool stuff cool people were doing, and I figured it was worth a bit of digging around. So I made a script. It was a pretty simple one: it pulled info from an API and added rows to a sheet.
It took me three days to write. I blamed the development environment. I asked the author of the insanely great and probably insanely-difficult-to-maintain Docopus whether there were any legitimate options out there that allowed unit tests and proper debugging.
So I set out to figure out A Better Way. I hated that I had to click on a play button and not see immediate logs from my running program. I’m used to writing Python code and seeing results immediately, no waiting around. Maybe it’s because I’m an amateur JS coder. Whatever. I hated it.
First of all, GAS scripting from within the browser, as it stands now, feels only a tick more mature than GUI scripting with AppleScript. Have you ever tried GUI scripting with AppleScript? Just don’t. It’s a horrible environment, filled with traps and pitfalls. There are like three guys in the world who do it all the time, and they know everything, and they dominate the help-focused mailing lists. Okay, so I’m exaggerating, but not all that much.
Here it is:
You can install node.js and use its require() function to load dummy objects that have the same names as those in the GAS API, such as Logger, with methods that do the terminal equivalent. So that Logger object’s log method would just call console.log. A SpreadsheetApp object’s getActiveSheet method returns a Sheet object that is just a fancy container for an array. That Sheet object’s appendRow() method just pushes to the contained array. Simple as.
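A minimal sketch of what those dummy objects could look like (the file name gas-mocks.js and the exact set of methods are my own assumptions; the real GAS classes have far more methods than this):

```javascript
// gas-mocks.js -- node.js stand-ins for the GAS globals a script uses.
// Only the handful of methods the script actually calls are mocked.

// Logger.log just becomes console.log on the terminal.
const Logger = {
  log: (msg) => console.log(msg),
};

// The mock sheet is just a fancy container for an array:
// appendRow() pushes onto it, so a test can inspect what was written.
function makeSheet() {
  const rows = [];
  return {
    appendRow: (row) => rows.push(row),
    getDataRange: () => ({ getValues: () => rows }),
  };
}

const activeSheet = makeSheet();
const SpreadsheetApp = {
  getActiveSheet: () => activeSheet,
};

module.exports = { Logger, SpreadsheetApp };
```

Your script then calls `SpreadsheetApp.getActiveSheet().appendRow([...])` exactly as it would in the browser editor, and afterwards you can inspect the array to see what would have been written to the sheet.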
This will also allow unit testing and other nice things, because making dummy data that just tests the integrity of the logic and its assumptions is relatively painless. And since you are on the local machine, iteration is fast. Also, since you’re using node.js, you have full access to a completely functional command-driven debugger.
Now, I’m not sure how this will scale for scripts that use all sorts of GAS API objects. I reckon it wouldn’t be much fun to have to make new mocks for each project, but this has to be better than waiting for a Google Doc to open just so I can click on Script Editor, just so I can press the play button, wait 10 seconds for it to launch, and then wait for it to finish before I see any log items.
Also, it’s not strictly copy-and-paste, because you still need some variable-assignment boilerplate, but defining the mocks this way keeps the vast majority of the magical “on-the-local-machine-only” code away from that blasted browser.
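The boilerplate could be as small as a guard at the top of the script (a sketch of one way to wire it up, my own assumption rather than anything GAS provides; the mocks are inlined here, though normally they would live in their own file):

```javascript
// Top-of-file boilerplate: under node the GAS globals don't exist,
// so install mocks; in the browser editor this branch is skipped.
if (typeof Logger === 'undefined') {
  global.Logger = { log: console.log };
  const rows = [];
  global.SpreadsheetApp = {
    getActiveSheet: () => ({ rows: rows, appendRow: (r) => rows.push(r) }),
  };
}

// From here on, the code is identical in node and in the GAS editor.
Logger.log('starting');
var sheet = SpreadsheetApp.getActiveSheet();
sheet.appendRow(['date', 'value']);
```

Everything below the guard is the code you actually paste into the Script Editor unchanged.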
Posted by amorris on November 10, 2014
ScreenFlow 5 is out in the Mac App Store, on sale cheap. If you ever do a screencast, this is like the iTunes for screencasts. Yes please.
Posted by amorris on November 6, 2014
Oh man. There is a lot of truth in this:
Think back to a multi-day conference or long PD day you had and remember that feeling by the end of the day – that need to just disconnect, break free, go for a run, chat with a friend, or surf the web and catch up on emails. That is how students often feel in our classes, not because we are boring per se but because they have been sitting and listening most of the day already. They have had enough.
Two years ago I went to a conference that was heavy on information overload, so on the last day I “had enough” and spent the day in a coffee shop instead, where I met other teachers who were using some online tools I knew something about, and we shared. I remember the comments, and I even took some of them and developed stuff based on what I saw. They were English teachers using Moodle, and it was the best thing that happened to me during the whole conference, and it happened when I literally walked away from it.
Sitting around all day taking in information is hard. I often complain to my wife about it when it happens, so I always think I’m probably a bit more sensitive than others, but I think we can all relate to having a PD day where it didn’t actually feel like you did anything, and when it’s over you’re completely wiped out anyway.
Hopefully my use of humor in the classroom breaks things up quite a bit. But still, this post has opened my eyes a bit. That’s it. I’m going to do a lot more drama in my lessons from now on. <grrr></grrr>
Posted by amorris on October 28, 2014
On any given day, at my workplace, I use Google services probably billions upon billions of times. Non-existent numbers like that are nice to have, though … aren’t they? … because what is really meant is that I use Google for very nearly every click of the mouse and every hit of the return key; I quickly lose count, and that’s the utility of silly numbers (cf. “zillions”).
Anyway. So I use Google services quite a bit. Our internal DNS server just forwards to Google DNS, so I’m probably using Google services 20 times per web access, at least. In case that just went over your head: every time I visit a website (= 1 web access), there are 20 additional “websites” accessed after it, because any one website has dependencies on other services.
Which brings me (finally) to the article I just read, “Peak Google“, which argues that Google may well be like IBM back in its heyday: really profitable and really scary, but with a business model that has probably already peaked, and we all know what happens when things peak. It’s a more sophisticated argument than that, so be sure to give it a close read yourself.
It’s fascinating to me because, to my mind, the only truly sustainable software model is Open Source. I mean, they recently found a bug that was 25 years old and everyone just sort of laughs it off. Yes, I know that’s a pretty unsophisticated summary of what happened with the Shellshock bug (that vulnerability really is an 11 out of 10), but the world goes on, because generally it’s a sustainable model. Imagine if Apple software had a bug like that. Babam.
With Open Source, no one gets fired and there are no stocks to worry about; you just fix it and get everyone to patch. It’ll probably cause adjustments in the community so that such a thing can’t happen again (alas, it will!). By contrast, there are a zillion lines of Google code locked behind a company wall, and what happens to those if-slash-when Google becomes just an IBM-of-today rather than the IBM-of-the-eighties?
Posted by amorris on October 23, 2014
It’s early days, and all that, but one of the main threads coming out of the aftermath of the downing of flight MH17 is that Putin provided these pro-Russian forces with some pretty advanced weaponry. Take a look at Josh Marshall, for example, whose key paragraph is:
The audio tapes posted by The New York Times might as well be from some future Russia-based version of Waiting for Guffman or Best in Show, a comical rendering of rustics and morons stumbling into an event of vast carnage and international consequence mainly because they’re hotheads and idiots – the kind of people no one in their right minds would give world class weaponry to. It’s like finding some white supremacist/militia types on their little compound in the inter-Mountain west and giving them world class missile launchers and heavy armaments.
I’m not sure what evidence there is that these guys are morons and/or hotheads whom no one in their right mind would give weapons to, but America has given weapons to some pretty stupid people (like, say, Saddam Hussein, and any number of other pro-Whatever forces) to do some stupid things with.
I’ll leave that train of thought, first because America didn’t directly give Hussein actual weapons but instead arranged to make sure he had some, and second because the bigger train of thought is this:
All this weaponry lying around will eventually be put into the hands of a “moron” or a plain troublemaker, who could do some pretty stupid things with it, including the opening act of a war that results in the death of our very planet.
Posted by amorris on July 19, 2014
So people are thinking there will be an iWatch out from Apple come this October. I haven’t seen a picture of anything that I think:
- I would wear, or
- someone else would wear (for more than a week)
If Apple is doing an iWatch, given their design process and philosophy, this is what it will be:
- It will look more like clothes than a gadget.
- Bracelet, or necklace-like. In other words, not a watch.
- Waterproof to x feet. Think swimming and showering with the thing.
- No camera at all. Not in version 1.
- Connects to your other devices, an extension of your other devices.
- Minimal user interface, if any at all.
I don’t for a second think Apple is going to release a product with some X or Y feature that everyone will want. Apple designs from the ground up; in other words, first and foremost it will be something smart that is worth wearing once or twice, and after that there will be some killer functions that people keep wearing it for.
Posted by amorris on June 29, 2014
Turns out there are real reasons to cast doubt on the idea that the universe is probably teeming with intelligent life. Sure, maybe it has loads and loads of microbes, but human-like species? We may well be the first, and almost certainly the only one in our galaxy.
Posted by amorris on June 29, 2014
I signed up as an Apple Developer because I had heard some really amazing things coming out of WWDC 2014. Apparently, and I’ll let readers find out for themselves, that conference pretty much ushered in The New Apple. The amount of stuff they unleashed on the world is so comprehensive, and so futuristic-looking, that if you’re a developer-minded person, your mouth just has to drop open in awe.
Previously, I hadn’t been attracted to Mac development for one major reason: Objective-C. I just can’t stand writing C code. I know what pointers are, and I know why we’re always checking for nil, but my brain doesn’t like that sort of low-level stuff. That’s why I do everything I can in Python: it’s just the best high-level language there is. Plus, Objective-C has all those horrible names with NS prefixes all over the place, another thing that drives me mad. Reading Objective-C code is just so hard. I’m a teacher, and I don’t have spare hours to spend unpacking whatever I develop. Just let me code it up.
And now Apple has launched Swift, and like most Apple programmer geeks, I consumed the book, and not only is it Python-like, but it levels the playing field considerably. If you’re someone who’s always wanted to enter the Mac development ecosphere (hand goes up real high) but has been traumatised at the thought of having to unlearn everything you know just to start learning everything again, now is the time to do it.
It’s funny, because I got serious into Python when they launched Python 3, which was a similar situation.
I think this is my way of announcing to the world that I’m going to be making a Mac program within the year. Hmm.. I wonder what sort of project I might work on.
Posted by amorris on June 14, 2014
I’d never heard of MetaFilter, but reading about its imminent demise makes for interesting reflection. Think about it: entire ecosystems depend entirely on two Google services.
And those services are:
The other week a colleague was ruminating on the fact that they’re starting to avoid Facebook because, well, it was getting to be just too much. Like Google, entire livelihoods depend upon services Facebook provides, yet all it does is solve a fairly simple problem: staying in touch with people. Google solved the “how do I find stuff on the internet” problem.
So I’ll coin a term here, “super pillar”: an online service that millions of other services depend upon.
The same colleague asked out loud, “Is there anything on the internet that is forever?” … to which I replied, “email”. Despite email’s horrible reputation (millions of spam bots everywhere), it truly is one of the few things that is “forever” on the Internet. And email is:
Although it doesn’t quite qualify as Open Source, in a way it was the first successful “open source-ish” project the Internet had, and it remains the foremost service the Internet offers. Interesting to think that both Google and Facebook have answers to the problems that email solves…
Posted by amorris on May 24, 2014