Joshua Paling

I'm blogging sometimes on medium (2018-10-14) <p>Follow my infrequent posts from now on over at <a href="">medium</a>.</p> Three rules for good quality software (2016-02-04) <p>I was once told three rules for good writing:</p> <ul> <li>Stern logic</li> <li>Clear connections</li> <li>Utter simplicity</li> </ul> <p>The same rules apply to writing software. Forget design patterns and best practices: they're a means to an end, not an end in themselves. If your code adheres to these three rules, it's good code. No matter what anyone says.</p> Summary of Sandi's Keep Ruby Weird keynote (2015-12-08) <p>Sandi Metz gave a <a href="">great keynote</a> at <a href="">Keep Ruby Weird</a>. You should watch it.</p> <p>Here's a summary for my own future reference. She goes through 3 interesting psychological experiments, and then covers the take-home points with regards to software dev. Here they are:</p> <h2 id="the-line-length-study-asch-on-conformityhttpsenwikipediaorgwikiaschconformityexperiments">1. <a href="">The Line Length Study (Asch on Conformity)</a></h2> <p>A subject is asked to say which of 3 lines is the same length as a given "reference line". Several confederates to the left of the subject answer incorrectly. A third of subjects will answer incorrectly, although most do so knowing they're wrong, just to conform.</p> <p>Add in just one confederate who answers correctly, and the subject will be much more likely to answer correctly (ie, conformity goes down).</p> <p>In another variant, the subject is told he is late and must therefore write his own answer on paper rather than speak it out. 
Again, conformity goes down.</p> <p><strong>The Take-home points:</strong></p> <p><em>Want conformity (ie, everyone gives the same answer)?</em></p> <ul> <li>Work in groups of 3 or more</li> <li>Elicit opinions publicly</li> <li>Allow the most authoritative people to speak first</li> </ul> <p><em>Prefer diversity (ie, more than one idea to choose from)?</em></p> <ul> <li>Record opinions privately before discussing publicly</li> <li>Ensure equal air time for all speakers</li> </ul> <h2 id="the-electric-shock-study-milgram-on-obedienceauthorityhttpsenwikipediaorgwikimilgramexperiment">2. <a href="">The Electric Shock Study (Milgram on Obedience/Authority)</a></h2> <p>Stanley Milgram was interested in how easily ordinary people could be influenced into committing atrocities (for example, Germans in WWII).</p> <p>A "teacher" (the subject) would test a confederate (the "student") in a different room on word pairs. The teacher is asked to give the student an electric shock of increasing strength for each wrong answer, using a machine with settings labeled from 15 volts (slight shock) to 450 volts (danger: severe shock).</p> <p>If subjects resisted perpetrating another shock, they were prompted with:</p> <ul> <li>Prod 1: Please continue.</li> <li>Prod 2: The experiment requires you to continue.</li> <li>Prod 3: It is absolutely essential that you continue.</li> <li>Prod 4: You have no other choice but to continue.</li> </ul> <p>If they continued to resist after that, they could stop.</p> <p>Contrary to what the experimenters anticipated, two thirds of subjects continued through to carry out the highest level of shock, despite the student screaming in pain, and even when the student went dead silent at the higher shock levels.</p> <p>Variants which decrease the level of obedience are below:</p> <p><em>[Image: variants that decrease obedience (blog/2015/obedience.png)]</em></p> <p>The two main themes which decrease obedience are: decrease perceived authority, and increase closeness to the victim / feeling of agency in the subject.</p> <p><strong>The Take-home 
points:</strong></p> <p><em>Want obedience?</em></p> <ul> <li>Have a hovering authority figure giving commands</li> <li>Isolate the actor from the victim (eg, don't let devs talk to users who are affected by bad software)</li> <li>Isolate the actor from others who disobey (eg, don't let devs talk to each other)</li> </ul> <p><em>Prefer independent thought? (ie, the best ideas, regardless of the effects of conformity)</em></p> <ul> <li>Be hands-off (increase distance to authority figures)</li> <li>Reduce distance to the victim (devs talk directly to people affected by their decisions)</li> <li>Allow the actor to collaborate with other independent thinkers</li> </ul> <h2 id="the-smoke-filled-room-study-bystander-effecthttpsenwikipediaorgwikimilgramexperiment">3. <a href="">The Smoke Filled Room Study (Bystander effect)</a></h2> <p>A subject waits, filling out a form, in a room full of confederates. Smoke starts coming out of the vents. The confederates are instructed not to react at all. The subject just follows their lead. Even when the smoke alarm goes off, the subject stays (for an average of 13 minutes).</p> <p>When placed alone in the same situation, subjects left the room quickly.</p> <p>In a similar study, if someone lies hurt on the street, people will just pass by. But once one person stops to help, others will also stop.</p> <p><strong>The Bystander effect:</strong> <em>In an emergency, the more onlookers there are, the less likely it is anyone will come to your aid.</em></p> <p>This is like maintaining a gem, asking for help, and getting radio silence back.</p> <p><strong>The Take-home points:</strong></p> <p><em>Want inaction?</em></p> <ul> <li>Direct requests to a group.</li> </ul> <p><em>Want action?</em></p> <ul> <li>Direct requests to an individual.</li> </ul> <p>The opposite is also true: if you're aware of the bystander effect, you can break the conformity of the group and be the one who steps up. 
If you do volunteer to help, you create a new group, and others will join it.</p> <p><strong>As a species, we are hard-wired to conform, obey, and derive rules from the behaviour of others in the group. It is hubris to imagine you're immune to this. You're affected like everyone else. These tendencies lead us to act in groups, in ways that do not reflect our best selves as individuals. If you refuse to believe you'll be affected, then you're doomed to be. The only way to combat these tendencies is to be aware of their existence. In order to be the best group we can be, each of us has to act as though we're the only one there.</strong></p> Spring stop (2015-11-30) <p>A note for my own future reference: if you're using Rails with Spring, and you add a new class in a new file under a new folder in your /app directory (eg, adding your first service to an app at <code>/app/services/my_service.rb</code>), Rails will not find that new class until you've run <code>spring stop</code>.</p> <p>If you're adding a new file to an existing subfolder under <code>app</code>, you're fine.</p> Never, ever, ever use Rails default scopes (2015-11-18) <p>I'm <a href="">not</a> <a href="">the</a> <a href="">first</a> <a href="">person</a> to recommend against default scopes.</p> <p>I just wanted to throw one more warning in the pot. My previous convention was that I would ONLY use default_scope for ordering:</p> <pre><code class="language-ruby">class Customer &lt; ActiveRecord::Base
  default_scope -&gt; { order('name ASC') }
end
</code></pre> <p>Seems pretty harmless, right? Wrong.</p> <p>I've spent the last few hours tracking down a bug. It turns out the cause is that when you do <code>Customer.last</code>, Rails orders by your default scope, if one exists. So you don't get the last inserted record; you get (in this case) the last record alphabetically. 
You can work around it by using <code>Customer.unscoped.last</code>, but sooner or later you will forget to do this, and it won't be fun.</p> <p>Never, ever, ever use Rails default scopes.</p> The bare minimum developers should know about SEO (2015-11-10) <p><strong>Slides and transcript of my recent talk given at both #rorosyd and <a href=""></a> meetups this November:</strong></p> <script async="" class="speakerdeck-embed" data-id="bc8e2347ddd44b8b8c25946312ff95e0" data-ratio="1.33333333333333" src="//"></script> <p>Hi, I'm Joss, and I am not an SEO. However, I know enough to be able to call out really bad SEO advice when I see it, and that's something I feel every developer could benefit from, because over the span of your career, there's a non-trivial chance you'll come across it. I should warn you that I've compromised flow and ease of following along in favour of cramming a lot of info into 15 minutes, so I'll move quickly, but I'll post everything on my blog afterwards.</p> <p>A quick bit of prerequisite knowledge before I start: nowadays, when Google Bot sees your site, it basically sees it through a headless browser. 
It is aware of CSS and some JavaScript, so it's aware of things like tabs and expandable sections.</p> <p>Business owners are desperate to rank on the first page of Google, yet SEO is relatively mysterious to most people, and it's very hard to accurately track results from implementing website copy changes and other SEO advice through to improved search rankings, then on to improved conversion rates, and ultimately to cash in the bank and an improvement to the business's bottom line. Because that whole pipeline is so tricky to track, SEO can be a very smoke-and-mirrors industry. While there are good SEOs, there are certainly others who are basically just good talkers (which, by the way, I'm becoming increasingly convinced is the single most important skill to have if you want to do well in any business!). Anyway… I aim to give you a sort of top-level, common-sense view of SEO from 1000 ft. And we'll start with a very brief and drastically over-simplified example of iteratively building a search engine.</p> <p>So, world domination and conspiracy theories aside, Google has one agenda. When you search for pizza, it wants to give you the world's best resources for pizza. So imagine you're building a search engine from scratch, way back when the internet was a baby. And you think "this is simple: I'll just count the number of times a page mentions pizza, and the one that mentions it the most comes up first." But Tony, from Tony's Pizza, catches on to this. So he puts "pizza pizza pizza" hundreds of times, in white text against a white background, at the bottom of his page, just to get to the top. And get to the top he does: he ranks number 1 for pizza. But you catch on to him and realise: it's not a genuinely great resource, he's just managed to find a loophole in your algorithm. 
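<p>That naive first pass, ranking purely by keyword count, could be sketched like this (a toy illustration only, with made-up page text and site names; real search engines obviously do nothing this simple):</p>

```ruby
# Toy "rank by keyword count" scorer from the first iteration of
# our over-simplified search engine.
def keyword_count(text, keyword)
  text.downcase.scan(/\b#{Regexp.escape(keyword.downcase)}\b/).size
end

pages = {
  "tonys-pizza.example"  => "pizza " * 300, # Tony's white-on-white spam trick
  "honest-pizza.example" => "A genuine guide to making great pizza at home.",
}

# Sort pages by descending keyword count: most mentions wins.
ranked = pages.sort_by { |_url, text| -keyword_count(text, "pizza") }
ranked.first.first # the spammer comes up first
```

The point of the sketch is that nothing here measures whether a page is a genuinely good resource, which is exactly the loophole Tony exploits.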
So you go back to the drawing board, and say "I'm actually going to penalise sites that do that spamming trick." So Tony's Pizza drops off the first page. And you say "what I'm actually looking for is an optimal keyword density: on the best sites, pizza will make up 3% of the words on the page."</p> <p>Of course, people catch on and cheat that too, making sure their sites hit the optimal keyword density, even if it means making the reading experience clunkier for the end user. So you go back to the drawing board, and say "well, there's nothing magic about 3% as a keyword density. It's probably more of a bell curve situation. And if a keyword appears in the main heading of the page, or in the first paragraph, that should count for a bit more than a word down the bottom. And if a word appears in the domain name itself, that counts for even more." And so on. Of course, people catch on to these changes too, and the cycle continues.</p> <p>The specifics I've mentioned are drastically over-simplified examples, but there is a constant cat-and-mouse game in SEO: Google is constantly trying to make its algorithm more bullet-proof, and SEOs are constantly trying to figure out how the algorithm works, both to gain a better understanding of what Google sees as a good resource, and potentially also to look for loopholes, shortcuts to the top.</p> <p>Practices falling under the former umbrella, understanding what Google sees as a good resource and trying to make your site one, are recommended by Google, and are sometimes referred to as "white hat SEO". The latter, cheating the algorithm, is referred to as "black hat SEO". It can work very well short term, then produce bad results long term, and in some rare cases even a ban from Google's results.</p> <p>So if you only remember one thing from this talk, make it this: Google's algorithm is really good now, and getting better all the time. 
It takes into account over 200 factors when ranking a page, and there are no silver bullets or easy cheats. The simplest, most future-proof approach to SEO is to actually make your site a genuinely great resource for the terms you want to rank on. If it's pizza, that might mean providing guides to making your own pizza dough, reviews of the best at-home pizza ovens, reviews of the best local pizza restaurants, and so on. Make your site the kind of site that users would WANT to see at the top of their search results.</p> <p>Now, back to building our simplified search engine.</p> <p>One of the things that has been used to rank sites for a long time, even prior to Google, is links. Links are like votes. If someone links to a site, it's like they're saying "hey, this is a good resource, check it out". So you've got the same cat-and-mouse game with links, of course. In our over-simplified example, you'd start by simply counting the number of links: the site with the most links wins. But then people open "link farm" sites, where you can pay a small fee and get hundreds of links from these link farms and online directories. So you develop a means to rank links on quality: higher quality links count for more votes. Links are still a very central part of SEO, and Google is constantly refining how it uses links to determine good quality sites. At one point, the anchor text of links was very important to Google.</p> <p>So in our pizza example, we'd be trying to get people to link to our pizza site with the anchor text "best pizza", or something like that. People had some fun with this: they link-bombed George W Bush's site with the anchor text "miserable failure", so in 2004 if you googled that term, Bush came up #1. In response to link pranks like this, in 2007 Google released an update to its algorithm that basically stopped them from working, which is a shame, because I'm sure we could have had some great fun with Tony Abbott. 
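<p>The progression above, from raw link counts to quality-weighted votes, could be sketched like this (purely hypothetical site names and weights, chosen only to illustrate the idea):</p>

```ruby
# Toy link scorer: each inbound link is a vote, weighted by the
# linking site's own quality score. Link farms count for almost nothing.
QUALITY = {
  "smh-food.example"    => 100.0, # respected publication
  "yellowpages.example" => 2.0,   # big undifferentiated directory
  "linkfarm.example"    => 0.1,   # pay-for-links spam
}

def link_score(inbound_links)
  inbound_links.sum { |site| QUALITY.fetch(site, 1.0) }
end

farmed = link_score(["linkfarm.example"] * 500)                   # 500 junk links
earned = link_score(["smh-food.example", "yellowpages.example"])  # 2 real votes
earned > farmed # two genuine votes outweigh hundreds of farmed ones
```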
Anyway, links are very important, like I said, so I'll spend some time now talking about links.</p> <p>The #1 rule: you are not allowed to pay for links in an attempt to manipulate your ranking. People do do this, and it does work, but if Google's algorithm suspects you've paid for links, it'll raise a flag, someone will look into the situation manually, and you can potentially be banned from search results, because essentially you're paying to look like a good resource when you're not actually one. Of course, paying for ads is fine; what's not OK is paying for links which are strategically placed and designed to manipulate your ranking rather than simply have people click on them. Soliciting unpaid links in a similar manner is absolutely fine: that's just called marketing. All businesses should do it, and Google has no problem with it.</p> <p>Next point: not all links are created equal. What if Tony's Pizza has an inbound link coming from a site selling cheap viagra? Should that rank his site any higher? Does that lend credibility, or take it away? If anything, it should rank his site lower. What about a link from something like Yellow Pages online, just a big directory of businesses with links to each one? Well, Google will count it for something, but not much.</p> <p>What if a very popular and highly respected site, like the SMH's food and lifestyle section, did an article on the best pizza in Sydney, and linked to Tony's Pizza? Well, that's huge, right? That's hundreds of times more of a vote than the Yellow Pages link. What if that same article also linked to 4 other highly popular pizza sites? Google looks at stuff like that. That'd put Tony's in the same company as those other great resources, so it adds credibility and boosts rankings too.</p> <p>There are two things to be aware of from this example. First, links from high ranking sites are much more influential. Second, links from sites on topics related to your own hold more weight than others. 
This idea of related sites is sometimes described as your "online neighbourhood". The "neighbourhood" of a pizza site is primarily other pizza sites, then other food sites, perhaps other Italian sites, and so on. So if a food blogger reviewed your site, that'd be a good link too.</p> <p>One thing I hope is clear to everyone is just how common-sensical all the points I've mentioned so far are. Of course sites in your neighbourhood should hold more weight. Of course a link from SMH should be better than one from cheap viagra.</p> <p>Now, for lack of time, I'll move through some other points on links quite quickly:</p> <p>Varied anchor text in inbound links is good, because it looks natural. If 80% of inbound links to Tony's Pizza had the exact same anchor text, say "best pizza", then that may look like a spammy, artificial attempt to manipulate ranking. And just to expand on that point: all good SEO looks natural. If some SEO advice seems awkward, or makes for a clumsier user experience in an attempt to boost rankings, it's likely wrong, or at least outdated.</p> <p>Inbound links from .gov and .edu top level domains are particularly good; those TLDs are considered particularly trustworthy (because the government never lies).</p> <p>Old links are also good. A link to Tony's Pizza that's been around for four years is better than one created last month. And the age of a domain name is a factor in determining credibility: both the age of the domain itself and the age of the domains that link to it.</p> <p>Next, reciprocal links. You might've heard they can be an issue. Let's pretend we're Google Bot, who's very pragmatic, and consider reciprocal links. If SMH linked to Tony's Pizza from an article, and Tony's site linked back to that article, saying "check it out", is that OK? Of course! Should that devalue the SMH link in any way? No way. 
You'd absolutely expect sites to have a certain percentage of reciprocal links.</p> <p>But what about this situation: imagine if 80% of all inbound links to Tony's were reciprocal links. Does that look natural? No. That looks like Tony's gone around saying "I'll scratch your back if you scratch mine". So a high percentage of reciprocal links can be detrimental to SEO.</p> <p>I'll move on from links now, for lack of time, but the main thing is: quality over quantity, and it should all look natural.</p> <p>I'll mention a few other gotchas and common SEO questions:</p> <p>Hidden text. Can Google see it? Does it think there's anything wrong with it?</p> <p>There are certainly legitimate uses for hiding text, such as tabs, expandable sections, etc. And as long as the hidden content isn't pulled in dynamically via ajax (and nowadays possibly even if it is), Google can see the content, and it will index it.</p> <p>Let's say I google "how to make pizza dough", and imagine Tony's Pizza has content about that, hidden by default on tab number 2. Google can see that content. But is that the resource I want to see first in my search results? Probably not. I'd probably rather see a page that has the pizza dough content visible and in plain view by default. So, because hidden content is significantly less important to the user experience, Google places significantly less weight on it accordingly. In terms of ranking well for "pizza dough" searches, opting for separate pages rather than tabs may well be the better option.</p> <p>Likewise, in-page links, like you might use on single page sites. Is Google aware of them? Yes. It'll be able to tell that the bits you link to are important bits of the page. But is the best resource on a topic likely to be a single page site? Probably not. 
If SEO is a critical part of your strategy, you're better off having a substantial amount of content, linked to across several pages.</p> <p>And note again that this is all stuff that very much appeals to common sense. SEO isn't nearly as mystical as it may first seem, and if you do get SEO advice that seems mystical, or that just seems a bit spammy, deceptive or low class, that's a big red flag. Being a genuinely great resource is key.</p> <p>Writing copy. When writing content for your site, you should think about the types of terms people will be searching for when you want them to find your content, and you should make an effort to include those terms in your copy, making sure, of course, to do so in a natural, non-spammy, non-awkward way.</p> <p>Furthermore, once you think you know the terms people will search for, you should use tools like Google Trends to check, and try to find more popular synonyms or related search terms. For example, if you're an airline, optimising your site for the term "discount airfares" is a big mistake. As the blue line shows, no one searches for that. They all search for "cheap flights" instead, which is shown in red.</p> <p>Coming back to our pizza dough example, Google Trends tells us that "pizza dough recipe" is a far more popular search term than "homemade pizza dough" or "how to make pizza dough", and we should keep that in mind when writing our copy.</p> <p>While I'm talking about search terms: sometimes people get unnecessarily caught up in a single "trophy phrase". For example, "I don't care about any other terms, I just want to rank #1 for best pizza in Sydney". That strategy is almost always a mistake. People search for lots of different terms, and of the millions of queries Google receives, about 15% are completely new terms, never searched before. There's an idea in SEO of optimising for "long tail keywords": longer, more specific, less frequently searched for terms. 
The "long tail" idea is a visual metaphor for search terms vs frequency. If you imagine every search term ever lined up along the x-axis, with the number of times it's been searched plotted on the y-axis, the graph would look like this:</p> <p>It looks like most searches come from that green area of very frequently searched terms. But actually, that tail goes on and on and on. The majority of searches come from the long tail, from the humongous number of infrequently searched for terms. And that's why putting too much attention on a single trophy phrase can be a bad idea.</p> <p>I'm probably over 15 minutes at this point, so I'll wrap up. The cool term to use these days, so you can all feel hip, is "inbound marketing" rather than SEO, kinda like how the term UX has replaced design. Inbound marketing is basically an acknowledgement of the fact that your site gets inbound traffic via lots of sources, not just Google searches, so it's important to have a holistic strategy that thinks about social media, paid ads, natural search, driving recurring traffic, email marketing, even driving traffic from offline sources, and so on. Depending on your business model, you may apportion your time and resources differently across those different areas.</p> <p>So that's all I've got. Happy to take any questions.</p>