Thanks for watching my MozCon presentation! Even though it was only 30 minutes, it took me years to get to this point. I’m really excited to share concepts that should help you cut out boring tasks and focus on more important things.

Before you start reading, I want to say that programming is not a requirement in Marketing. The truth is that you don’t need a lot of programming knowledge to unlock powerful automation.

Here’s an honest breakdown and a few links to some of the tutorials. Everything you need is covered here, but you can always email me at [email protected] or [email protected] if you get stuck.

1) On low/no code automation

It’s not new but like most things, I think it’s about timing.

There’s a bunch of RPA (robotic process automation) software out there. Before you jump into a product, you should evaluate the software. I found that some of them were expensive or had a really high learning curve.

My advice is to experiment with free software like n8n first and try to define a few processes. The n8n tutorial is covered in point 11 of this email.

If you can’t do it alone, get help. I’m happy to talk through your ideas and can also suggest a few people.  This is no different than the challenges of programming – it’s more about figuring out the problem you’re trying to solve than the tech/code.

2) Keyword Classification

Make your own model, fast and easy. I used Britney Muller’s classification approach, based on her Whiteboard Friday (WBF).

Create a new application and if you want to use the same structure as I did, just import this file.

This is pretty easy: add your own entities and train the model. Experiment, make mistakes, learn.

3) Using programming languages (or no-code solutions) to perform HTTP GET requests

I used cURL in the presentation twice, but it’s not a requirement – I just wanted you to see what was happening under the hood. cURL is native on Mac and Linux; for Windows, you’ll need to follow this.

This is all a matter of preference; you can use no-code solutions like Postman (which, by the way, is fantastic).

The request isn’t the hard part; you’ll pick it up pretty quickly. API documentation and parsing data are the hard parts. Developers are notoriously bad at explaining their genius through documentation, examples and tutorials – it’s OK, I understand how annoyingly difficult this can be.

Once you’re comfortable getting data from APIs (or sending), the next step is to do things with it. I would suggest Python or JavaScript but it doesn’t mean you can’t use another language.  Those are just my preferred choices because there’s so much documentation on the web and lots of SEOs/Marketers are already writing quite a bit of code in each language.
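To make the GET-then-parse workflow concrete, here’s a minimal Python sketch using only the standard library. The response shape (an `intents` list) is a hypothetical placeholder, not the real API from the deck, so you can see the parsing logic without a live endpoint:

```python
import json
import urllib.request


def get_json(url, timeout=10):
    """Perform an HTTP GET and parse the JSON body."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))


def extract_intent(payload):
    """Pull the top intent out of a (hypothetical) classification response."""
    intents = payload.get("intents", [])
    return intents[0]["name"] if intents else None


# Canned response, so you can practice the parsing without a network call:
sample = '{"text": "buy red shoes", "intents": [{"name": "transactional", "confidence": 0.93}]}'
print(extract_intent(json.loads(sample)))  # transactional
```

Once the parsing works on a canned sample, swap in `get_json("https://your-api...")` to hit the real endpoint.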

Code from the deck to grab classifications for keywords in your application:



It’s best to ‘fork’ my examples because there might be dependencies.

4) POST requests

Same as point 3, except you’re sending data. You just have to experiment with the code samples; don’t be intimidated. Here are the examples from the deck:


Node.js

It’s best to ‘fork’ my examples because there might be dependencies.
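A POST is the same dance with a request body attached. Here’s a standard-library Python sketch; the URL, payload, and token are placeholders, and the request is built but not sent, so you can inspect it safely:

```python
import json
import urllib.request


def build_post(url, payload, token=None):
    """Build (but don't yet send) a JSON POST request."""
    headers = {"Content-Type": "application/json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(url, data=data, headers=headers, method="POST")


req = build_post("https://api.example.com/entities", {"keyword": "mozcon"}, token="YOUR_TOKEN")
print(req.get_method())                 # POST
print(req.get_header("Content-type"))   # application/json
# To actually send it: urllib.request.urlopen(req)
```

Building the request separately from sending it makes it easy to print and debug the headers and body before anything hits the wire.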

5) APIs worth watching

I mentioned these two during the talk:

Also, check out:

6) APIs in general

I thought this was a good API tutorial in general:

You already have the code in points 3 & 4 to put this together and go deeper. So here’s my advice: use the cURL links in points 3 and 4 to get yourself started, play with the commands, and make mistakes. Next, move on to JavaScript, Python, or any other language to perform a GET or POST request, then try parsing the response to get what you want.


7) JSON

JavaScript Object Notation (JSON) uses the same syntax as our JSON-LD schema markup, so you’ll probably feel right at home when looking at the code.
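Because JSON-LD is just JSON, any language’s JSON parser reads schema markup directly. A tiny Python sketch (the Article markup below is a made-up example, not from the deck):

```python
import json

# A made-up JSON-LD Article snippet, the kind you'd find in a <script> tag:
jsonld = """
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Automation for SEOs",
  "author": {"@type": "Person", "name": "Jane Doe"}
}
"""

doc = json.loads(jsonld)
print(doc["@type"])            # Article
print(doc["author"]["name"])   # Jane Doe
```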

Here’s a good video tutorial: 

And here’s a good blog post:


The full instructions to get the same API I demonstrated in the talk are in the video. I wish I had more to say, but it’s just built really well and doesn’t need further explanation.

8) Data transfer (ETL)

Extract, transform, load = ETL. That’s what you’ll Google to figure out how to transfer data from A to B. There are lots of good providers out there; I have a favourite, but I’ve also had great sales and product experiences with a few others.

Evaluate cost, value, and error reporting for yourself. Error reporting is especially important if your ETL pipeline is responsible for mission-critical decisions. Evaluate connectors too; don’t assume that these platforms are magically going to deliver perfect data to your data warehouse, because they are limited by the data provider. For example, GSC API data is notoriously unreliable, that is to say it will rarely match what you see in the interface for a multitude of reasons. So it won’t be your ETL provider at fault; it’s your source.
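When you suspect an API-vs-interface mismatch like the GSC one above, it helps to quantify the gap instead of eyeballing it. A small sketch with made-up click totals:

```python
def pct_diff(api_total, ui_total):
    """Percentage difference of the API figure vs. the UI figure."""
    return (api_total - ui_total) / ui_total * 100


# Hypothetical daily click totals for the same property and date range:
api_clicks, ui_clicks = 9_420, 10_000
print(f"API vs UI: {pct_diff(api_clicks, ui_clicks):+.1f}%")  # API vs UI: -5.8%
```

Tracking this difference over time tells you whether a discrepancy is the usual sampling noise from the source or a genuine pipeline failure worth debugging.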

9) Browser automation

This stuff is hard. There’s a bunch of software out there that I personally don’t really like; some have their own ‘languages’ and the learning curve can be quite high. You’ll likely have to install something locally which will have a direct connection to the software, and who knows where that data is going.

The tool I demonstrated during the presentation is, IMHO, the easiest way to execute browser automation.

Here’s the tutorial video:

If you’re hoping for more customization, use Puppeteer. It requires code, and honestly, it’s not that easy: you’ll need to know JavaScript, and there are a bunch of weird quirks you’ll encounter along the way. Stick to the simpler tool for basic automation until you really know what processes you want to automate. If you need more flexibility or privacy, move to headless browsers and specifically detail each step (use the Puppeteer recorder to help).

10) No-code machine learning through BigML

As if mathematics weren’t hard enough, most machine learning is deeply entrenched in Python code (though not exclusively). This is exactly why I personally use BigML, so I can focus on results and math rather than the environment and code.

Here’s the tutorial video I used in my talk:

Here’s the association discovery tutorial from BigML:

I love their platform, their employees, their service, and I feel the same about the other companies I’ve mentioned. These are kick-ass companies that I would support and love to work with. How’s that for a recommendation?

BigML offers a free plan with tasks up to 16 MB, which is more than enough to experiment with. I use BigML for text classification, traffic predictions, anomaly detection and, as of recently, association rules. It’s great, and if you trust me, I will tell you it’s worth your time, not only for the platform itself but for the people behind it.

True story: I used BigML in 3 presentations over 6 years, and each time I had personal help from engineers or VPs (thank you!). They are good people and this is a good company. Invest if they go public 😉

11) Deploying n8n (it’s like your own IFTTT)

Like I mentioned in the presentation, I evaluated every automation platform I could. I chose n8n because it was free and friendly to use. Hang on though: I need to tell you that I know how to code in JS, and that felt like a prerequisite to do things properly. I don’t see anyone escaping the learning curve, to be honest. There are so many cool visual-programming-type platforms out there, but they require a lot of time to learn properly. Anyways, here’s n8n:

Watch the tutorial:

Here’s the github repository with the one-click deploy:

This part is really important:

Once you set all that up, if you’re using the free version of Heroku, you need to keep the app ‘awake’ or your automations won’t run (free dynos go to sleep after a period of inactivity).

Create a new automation workflow with a single webhook (setup here). Get the webhook URL and ping it every 15 minutes.
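If you’d rather script the keep-alive yourself, here’s a standard-library Python sketch. The webhook URL is a placeholder for your own, and the actual pinging is left to cron so nothing runs until you schedule it:

```python
import urllib.request
from datetime import datetime, timedelta

WEBHOOK_URL = "https://your-app.herokuapp.com/webhook/keep-alive"  # placeholder


def ping(url):
    """GET the webhook so the free Heroku dyno stays awake; returns the HTTP status."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.status


def next_runs(start, count, minutes=15):
    """The next `count` ping times, spaced `minutes` apart (for sanity-checking the schedule)."""
    return [start + timedelta(minutes=minutes * i) for i in range(1, count + 1)]


runs = next_runs(datetime(2021, 7, 1, 9, 0), 3)
print([t.strftime("%H:%M") for t in runs])  # ['09:15', '09:30', '09:45']

# Then schedule ping(WEBHOOK_URL) with cron, e.g.:
# */15 * * * * /usr/bin/python3 /path/to/keepalive.py
```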

12) The Moz API for Sheets add-on

You can install it here: 

The blog post will be up shortly, and it will explain a few extra features.

We hope you like it! Please let us know if you run into issues – feel free to tweet at me directly @dsottimano.

13) Automation ideas

Here are a few ideas for mini projects you can try today:

Take screenshots of webpages

Get web vitals metrics from PSI

Classify web pages

Understand images before seeing them

On demand single page crawl

Extract audience intelligence through Twitter (Process by @richardbaxter)

Send data to data warehouse

Get a list of expiring domains and email it to me asap


Thanks again for watching the presentation and feel free to contact me if you’re interested in automation, maybe I can help make things easier for you.