Set up Laravel 5.4 with MAMP Pro on macOS Sierra

I regularly need to work on WordPress sites locally on my laptop, and for me nothing beats the convenience of MAMP Pro. I’ve also been working on some projects in Laravel, so I needed to add that to my local setup. There are many ways to do this, including Valet and Homestead, both of which are provided by Laravel. However, I didn’t need the power of Homestead, and the last time I tried setting up Valet alongside MAMP I ended up with a mess of conflicting versions. In trying to unravel that mess I pretty much totalled my Mac install.

After wiping and reinstalling my Mac, I decided that I would try setting up Laravel to work with MAMP Pro. I couldn’t find an up-to-date set of instructions that covered every step, so I’ve put this here for my own future reference and in case it’s useful to anyone else.

I’m using Laravel 5.4 and MAMP Pro 4.1.1 on a clean install of macOS Sierra 10.12.5.

1. Set bash to use the MAMP version of PHP

The version of PHP that comes preinstalled on macOS is out of date, so the first thing to do is make sure that we are using the version installed with MAMP.

export PATH=/Applications/MAMP/bin/php/php7.x.x/bin:$PATH

Replace 7.x.x with the version of PHP you want to use in the terminal.
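
If you’re not sure which versions are available, you can list the ones bundled with your MAMP install. And since the export above only lasts for the current Terminal session, you can append it to your .bash_profile to make it permanent. A minimal sketch, assuming MAMP is in its default location:

# List the PHP versions bundled with MAMP
ls /Applications/MAMP/bin/php/

# Persist the PATH change across sessions (replace 7.x.x as above)
echo 'export PATH=/Applications/MAMP/bin/php/php7.x.x/bin:$PATH' >> ~/.bash_profile
source ~/.bash_profile

# Confirm that the MAMP PHP is now the one being used
which php
php -v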

2. Download and install Composer

Laravel uses Composer, so we need to download and install that:

curl -s http://getcomposer.org/installer | php

And then move it and set up an alias (see note at the end of step 4). I’m using nano, but use your editor of choice.

sudo mv composer.phar /usr/local/bin/
nano ~/.bash_profile

Add the following to the .bash_profile file.

alias composer="php /usr/local/bin/composer.phar"

Restart the Terminal, and you can run Composer by typing:

composer

3. Install Laravel using Composer

Now we can install and set up Laravel.

composer global require "laravel/installer"

Note: On a clean install, you will be prompted to install the Xcode command line tools (if you haven’t already done so).
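
It’s also worth checking where Composer has put the installer, as this is the directory we’ll add to the PATH in the next step. On this setup Composer’s global home is ~/.composer (some newer Composer versions use ~/.config/composer instead):

# The laravel executable should be listed here
ls ~/.composer/vendor/bin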

4. Make the laravel command available

So that the system can find the Laravel executable, we need to run:

echo 'export PATH="$PATH:$HOME/.composer/vendor/bin"' >> ~/.bash_profile
source ~/.bash_profile

Note: At this stage trying to use the laravel command produced an error saying that composer could not be found. Following advice in this thread on Laracasts I renamed composer.phar to composer and updated the alias.

sudo mv /usr/local/bin/composer.phar /usr/local/bin/composer
alias composer="php /usr/local/bin/composer"
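
With the file renamed and the alias updated in .bash_profile, restart the Terminal and both commands should resolve. A quick sanity check:

# Both should print version information without errors
composer --version
laravel --version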

5. Create a Laravel project and make it available via MAMP Pro

Create a new Laravel project

cd ~/Code
laravel new blog

In MAMP Pro we create a new host (I chose blog.dev) and then point the root to the public folder inside the Laravel project. I also chose to let MAMP Pro create a new database named ‘blog’.

[Screenshot: MAMP Pro hosts screen showing blog.dev set up]

Start the MAMP Pro servers and the site should be accessible at http://blog.dev

[Screenshot: A blank Laravel app running in Safari]
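
If you prefer to check from the Terminal, a quick curl request should return an HTTP 200 once the servers are running (assuming the blog.dev host set up above):

# Fetch just the response headers from the new site
curl -I http://blog.dev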

6. Using MAMP’s MySQL

If we want to use a database with our app we need to add the details to the .env file:

DB_CONNECTION=mysql
DB_HOST=localhost
DB_PORT=3306
DB_DATABASE=blog
DB_USERNAME=root
DB_PASSWORD=xxxxxx

Note that we’re using localhost rather than 127.0.0.1. If you want to use 127.0.0.1 you need to set MAMP Pro to ‘Allow network access to MySQL’ on the MySQL settings screen.
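
Alternatively, if you would rather leave network access to MySQL switched off, you may be able to connect through MAMP’s Unix socket instead. This is an untested sketch assuming MAMP’s default socket path; check that your config/database.php maps a unix_socket option to the DB_SOCKET variable (and add it if not):

DB_CONNECTION=mysql
DB_HOST=localhost
DB_SOCKET=/Applications/MAMP/tmp/mysql/mysql.sock
DB_DATABASE=blog
DB_USERNAME=root
DB_PASSWORD=xxxxxx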

[Screenshot: The MAMP MySQL settings screen]

When you run php artisan migrate you may see this error:

[PDOException]
SQLSTATE[42000]: Syntax error or access violation: 1071 Specified key was too long; max key length is 767 bytes

This seems to be caused by the version of MySQL that MAMP Pro uses. Following advice in this thread on Laracasts I added the following to app/Providers/AppServiceProvider.php:

use Illuminate\Support\Facades\Schema; // Add this to the top of the file

public function boot()
{
    Schema::defaultStringLength(191); // Add this line to the boot method
}

If we delete the tables created in the previous attempt (one way to do this is sketched below) and then run php artisan migrate again, it should now work and you should see output similar to this:

Migration table created successfully.
Migrating: 2014_10_12_000000_create_users_table
Migrated: 2014_10_12_000000_create_users_table
Migrating: 2014_10_12_100000_create_password_resets_table
Migrated: 2014_10_12_100000_create_password_resets_table
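
If you need to clear out the half-created tables from the failed run, one option is MAMP’s bundled mysql client. A sketch, assuming the default MAMP install path and the database details above:

# Drop the tables left behind by the failed migration (artisan will recreate them)
/Applications/MAMP/Library/bin/mysql -u root -p blog -e "DROP TABLE IF EXISTS migrations, password_resets, users;"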

Now that everything is set up we should just need to follow steps 5 and 6 each time we want to create a new Laravel project.

Dropping the Tech Giants

There was a nice, simple interactive article published in the New York Times last week titled Which Tech Giant Would You Drop? It links to a much more in-depth piece on how our lives are increasingly dominated by five tech giants – Alphabet (Google), Apple, Microsoft, Facebook and Amazon.

It then poses one simple question – if you were forced to, in what order would you give up these companies?

Here is my answer:

  1. Facebook – This is simple because I’ve already done it. I’ve never used Instagram, didn’t find WhatsApp useful and deleted my Facebook account earlier this year. I don’t miss it at all.
  2. Amazon – Although this global marketplace is convenient, it doesn’t offer me anything that I can’t get elsewhere. I wouldn’t miss the shopping side too much, and I definitely wouldn’t miss Amazon Video (what they offer is of little interest to me). That said, I would miss my Kindle.
  3. Alphabet/Google – I don’t use as many Google services as I used to, but some are so much better than their competitors that they are hard to give up. No other mapping service comes close to Google Maps for accuracy, and although I prefer Vimeo, I find more useful content on YouTube. My default search engine is DuckDuckGo, but sometimes I still need to use Google to get more relevant results.
  4. Microsoft – The giant from Redmond would be hard to do without. I continue to use some software that only runs on Windows. I also use Office every single working day, because it is the tool that all of my clients use.
  5. Apple – I’ve used Apple products for more than a decade and although I’ve been less impressed with them recently than I used to be, their products are so embedded in my work and personal life that they would be incredibly difficult to replace. Although I could replace all of the Apple products and services I use (and I have considered this), the almost seamless integration between everything is just too useful. There are also some Mac and iOS apps that have no alternatives of equal quality on other platforms. I’m thinking of apps like Ulysses, Alfred and DEVONthink.

This was an interesting exercise that made me realise a few things. Although I’d been on Facebook for nearly ten years, leaving was incredibly easy and I haven’t missed it once. I’ve used services from Amazon and Google pretty much by default, and have only recently taken the time to think about the value that I get from them and the cost of that.

I have also been less satisfied with Apple than I was a few years ago, and I have looked for alternatives. This exercise made me think carefully about how I actually use these products and services and whether I would be better served going elsewhere. I’m now sure that I wouldn’t.

I think we should all be more conscious of the technology that we use, and regularly review our choices.

Micro Learning Tools

On 11th May, along with my colleague Clive Shepherd, I presented on the subject of microlearning at the CIPD L&D Show. One topic of considerable interest was the tools that you could use to create content. Here, as promised to those who attended, is a non-definitive list of tools. If you have any tools you’d like to suggest, please add them in the comments.

How-to videos

  • Adobe Premiere
  • Final Cut Pro
  • Quik

Screencasts

  • Screencast-o-matic
  • Snagit
  • Camtasia
  • Adobe Captivate
  • uPerform

Infographics

  • PowerPoint
  • Piktochart
  • Canva
  • Visme
  • Venngage
  • Adobe Illustrator
  • Affinity Designer
  • Scribus
  • Inkscape

Explainer videos

  • PowerPoint
  • Prezi
  • Powtoon
  • GoAnimate
  • VideoScribe
  • Adobe After Effects

Quizzes and games

  • Qzzr
  • H5P
  • Moodle quizzes
  • Articulate Quizmaker

Web articles

Any web editor, e.g.

  • WordPress
  • Drupal
  • SharePoint
  • Sway

Interactive lessons

  • PowerPoint
  • H5P
  • Articulate 360
  • Adobe Captivate
  • Elucidat

Apple – style over substance?

I’ve been a Mac user for close to ten years, but for the past six months I’ve been splitting my time pretty much equally between a MacBook Pro and a Surface Book. So as someone with a foot firmly in each camp, I was very interested in the Microsoft and Apple events last week. In the few days since those events there has been a lot of comment, much of it highlighting the innovation coming out of Redmond and the perceived lack of it coming out of Cupertino – in particular, Apple’s lack of support for the professional market.

I don’t intend to add a huge amount as so much has already been said, but I do have a view.

As a long-time user of Apple products I have often rejected accusations that they are all about style over substance, but I can no longer do so. I read two posts on Daring Fireball this morning in which John Gruber summed up Apple’s approach.

In the first he said:

Apple simply places a higher priority on thinness and lightness than performance-hungry pro users do. Apple is more willing to compromise on performance than on thinness and lightness and battery life.

And in the other:

But the price you pay for the MacBook Pro isn’t about the sum of the components. It’s about getting them into that sleek, lightweight form factor, too. In a word, Apple is optimizing the MacBook lineup for niceness.

If that isn’t a description of style over substance I don’t know what is.

I think this is an indication that Apple is completing its transformation from a computer company to a consumer electronics company. That’s fine – it’s been a very successful strategy for them – but it’s time to accept that’s what they are and stop pretending that they’re the natural home of creative pros.

Most of the time I spend in front of my Mac or PC is focussed on creating things (words, images, videos etc.). I’ve been leaning more and more towards the Windows machine lately as it seems to be more reliable, it’s noticeably quicker and I find it just plain nicer to use – and I don’t think this is by accident. It really does appear that Microsoft are more interested in the pro market than Apple are, and they’re making the hardware and software to support them.

Welcome back Opera

Back in the early 2000s I used to use the Opera web browser. Then, as now, it was eclipsed by its rivals and never really got the adoption it deserved. I liked it partly because it was a really good browser, but mostly because it was the underdog competing with the heavyweights at Microsoft and Netscape.

Ultimately I left it for Firefox, and since then I have been a regular user of pretty much every browser you’ve heard of – and some you probably haven’t (Midori, anyone?).

Happily I’m now back using Opera as my daily browser. Why?

  • It’s based on Blink – the engine that powers Google Chrome – with all the speed and other benefits that brings
  • It supports Chrome plugins
  • It’s not a Google product and doesn’t track me
  • I use Windows, Mac, iOS and Android, and Opera gives the best cross-platform experience

I can recommend Opera as an alternative to any of the mainstream browsers.

Specialization, Polymaths And The Pareto Principle

Reading this article on TechCrunch I was struck by two things:

Deep expertise is less and less useful

If you consider just two things – the pace at which we increase our understanding of how human beings learn and the pace at which the technological environment around us is changing – it seems obvious that L&D should be a fast-moving field.

The reality is usually different, with people who work in L&D investing their time and effort in developing deep expertise in very narrow topics, often tied to qualifications that are rooted in the past.

As an industry we would surely benefit from everyone having a more diverse skill set. To use the terminology from the article, we need more Journeymen than Masters.

It isn’t just the tech industry that needs more polymaths

The most exciting and impactful projects that I work on are those driven by cross-disciplinary teams that work together throughout the project.

They work because at least some of the people in those teams have knowledge and skills that cross multiple domains, not just the one attached to their job title. They play a key role in helping people communicate and share ideas.

Being experts in learning is not enough, nor is just ‘talking the language of business’. We need people with diverse skills that are relevant to the organisations in which they work.

On the long slow death of Twitter

I came across this great cartoon in The Guardian over the weekend. It sums up much of what I think about Twitter – except for the final frame.

The cartoon uses the analogy of Twitter being a bar and concludes by wondering if “maybe it’s time to find another bar.” For me, Twitter feels more like that bar you went to at a certain time in your life, but now you have other things to do. You look back at it with good memories, but you don’t need to replace it.

Learn xAPI MOOC – Week 4 Reflections

Here are my thoughts after completing week 4 (the final week) of the Learn xAPI MOOC. This week is titled “Final Challenge and Post Conference Drinks” and it brings together the technical and strategy tracks.

There are three key questions:

Discussion point: Learning analytics is too important for L&D to own

I was surprised (and chuffed) to see that the first question was inspired by the blog post I wrote on 28th May. This was a response to a post on learning analytics written by Mark Aberdour.

There were great comments from everyone and I’m really pleased to have contributed to the debate.

Due to semantics (or perhaps just me not being clear in the first place) there were some comments around responsibility and accountability. To clarify – I believe 100% that L&D needs to take ownership of results, and to some extent that will include learning analytics.

My point was more that if organisations are going to invest in employing data scientists (as Mark suggested in the original post to which I was responding), they need to do their work at a level above L&D (and every other department). If the measurement and analysis of the data remains within L&D, there is a very high risk that all we end up measuring is our own performance in the context of our own measures (was it a good piece of learning material?) rather than the impact we are having (did it make any difference to workplace performance?).

L&D certainly needs to be involved in learning analytics. The alignment of learning analytics to performance outcomes is something that should begin at the earliest stages of design.

One of the hardest parts of analysing any data is working out what it actually means. I’m less sure that L&D should be the ones who decide what the data means – that’s where a data scientist (or whoever) looking at this at an organisational level comes in.

Discussion point: Most L&D teams lack the skills and mindset to make effective use of meaningful data. Do you think L&D teams have the potential to develop the necessary skills and will their organisations give them the opportunity to develop them?

This is the question that I was asking myself at the end of week 2.

First of all, this isn’t a question of capability – I expect any good L&D team would be capable of developing the skills. To what degree they should develop these skills will probably depend largely on the size of the team (in small teams it may be more about mindset and understanding rather than deep skills).

Whether they will have the opportunity is harder to answer and this will probably depend on how they are perceived by the organisation. My thinking has moved on a bit since week 2. Back then I was wondering if L&D will be given the opportunity. Now I think that opportunity isn’t something they should wait for – they should own this and get on with developing those skills.

Discussion point: How do we make sure that we don’t get carried away with what’s possible and instead focus on what is valuable? And lastly who is it valuable to?

I think these are the big questions and the ones that I’ve been thinking about since week one.

Focussing on the valuable rather than the possible must be one of the most common challenges facing anyone who works with technology and learning. The only way we do this is by understanding what is valuable to the organisation.

That means going beyond being order takers who simply satisfy the needs of the immediate stakeholder who comes to us for a solution. We need to be able to consider everything we do in the context of the organisation’s goals and, when necessary, challenge the stakeholder if they demand the wrong solution.

The rest of the week’s content was focussed on practical activity around visualisations and telling stories with data.

Summary of my thoughts after four weeks

I haven’t reached a definitive conclusion about xAPI, but that wasn’t my expectation, and my ideas will continue to evolve. Here is a snapshot at week 4:

  • I’m interested in xAPI in that it changes the technical tools we have for measuring activity. If we really need to track activity, xAPI goes beyond many of the limitations of SCORM (such as tracking activity on mobiles and in apps).
  • Whether or not we need to measure activity is another thing entirely. I’m not against measurement, but it needs to be the right measurement and it needs to be actionable.
  • I’d be surprised if we really see many organisations tracking anything other than activity.
  • I fear that clumsy attempts to use xAPI too widely will degrade the experience for learners.
  • After four weeks I still can’t see how the learner benefits from xAPI. The benefits are to L&D and maybe to the wider organisation if they get the data analysis side of it right.
  • xAPI tracks activity (albeit a wider range of activity than SCORM) but it does not track learning.
  • The name is terrible. Domain-specific acronyms make communication difficult and put people off by making them outsiders. See here.
  • People seem keen to use it outside L&D, but I think that may be difficult given that it is designed by L&D people, for L&D people to solve the problems that L&D people have.

Learn xAPI MOOC – Week 3 Reflections

Here are my thoughts after completing week 3 of the Learn xAPI MOOC.

The strategy track for this week is titled “Data and Learning Analytics”. I’d already been giving this some thought (see here) as I have concerns about the real benefits of even the simplest tracking and measurement, let alone ‘learning analytics’…

This week started off by addressing the difficulty that people often have in grasping the difference between correlation and causation. This was done by sharing a page from Explain xkcd (which is just brilliant!). I’ve bookmarked this for future use.

There was only one discussion point that interested me this week:

Discussion point: What sources of data could you be tapping into to build a better picture of the connections between training and performance?

There are, I think, only two possible answers:

  1. If you are considering this at a high level, then the answer is simply “whatever sources of data the organisation uses to measure performance”. So if you want to know if your customer service training programme has made any difference, you have to start by looking at whatever data source the organisation uses to measure customer service. Good luck with finding anything other than a superficial link.
  2. The alternative is to consider this at a much lower level. For every training intervention we should be identifying the intended performance outcome at the very earliest stages of design. At that stage we should also be identifying what data we need to measure that performance and where to source that data. If that data source doesn’t already exist, there’s a good chance that’s because what you intend to change isn’t important enough to measure.

The rest of the week’s content was focussed on analytics, data, visualisation, privacy and other general data related subjects. All interesting topics, but nothing I hadn’t already given a great deal of thought to.

Summary of my thoughts after three weeks

I’m still trying to make sense of xAPI, and after three weeks here are the main threads I’m considering.

  • Before starting the MOOC I suspected that most of the talk about measuring “learning outcomes” actually referred to measuring “training outcomes”. After three weeks on the MOOC, I’m now sure that’s true.
  • I worry that there is more interest in using xAPI to prove a link between L&D and performance in order to demonstrate the value of L&D rather than to actually improve workplace performance.
  • Just as I did at the start, I can see how tracking activity may bring plenty of benefits to L&D and pretty much none to learners.
  • In theory we could use xAPI to track a huge amount of activity, but I haven’t seen any compelling argument as to why we should. There seems to be an underlying assumption that measurement is a good thing (you only have to think about school league tables to know that it isn’t).
  • One of the more pertinent things I’ve read on the subject recently is this post from Henry Mintzberg. I’ve seen a lot of “measuring as a replacement for managing” within L&D.

The biggest worry I have is that in trying to measure too many things we will actually degrade the experience for learners. There are a few ways this could happen.

  1. We identify specific activities or content that we want to track and, to make that easier, we move them somewhere they are easier to track – you know, like an LMS.
  2. We give the learner some kind of tool (like a browser bookmarklet or app) and say “each time you learn/do/experience anything, just click this so we know about it. Oh, and make sure you pick the right verb.”
  3. We find some diabolical way to track everything they do and then analyse it in search of learning activity. Seriously, stop thinking about that right now.

Learning analytics is too important for L&D to own

I’m clearly not the only one to recognise that L&D lacks the skills needed for the kind of learning analytics enabled by advances such as xAPI.

Mark Aberdour has written a very thoughtful post about the challenges we face and makes this suggestion:

Clearly some of these items require close interaction with the L&D team, but in summary there is a real need to bring experienced data scientists into corporate learning and development, not just to set up analytics programmes but to continually monitor, review and refine the results.

via Building a learning analytics platform | Open Thoughts.

I agree that organisations need people who can interpret this data and make it actionable, but I don’t believe they should sit within L&D. If ownership of learning data remains within L&D, we risk continuing the current situation where all we do is measure the most basic elements of our performance (inputs and outputs) rather than the impact of learning on workplace performance.

For learning data to have strategic value it needs to be considered at a higher level, in combination with data from other parts of the organisation. To be objective, ownership for this needs to sit outside any department with a vested interest in the results.