Category Archives: analytics

How intelligent is AI today?

I recently attended a workshop by Arts Impact AI, which is holding conversations on AI across Canada. I quickly discovered that my expectation of what Artificial Intelligence (AI) is wasn’t quite in the right place for the conversation at hand. I had expected the discussion to centre on intelligent machines thinking and working like humans, exploring attributes such as self-learning or the ability to intelligently change their own programming based on new input.

Algorithm Making

We spent the morning considering algorithms capable of rapidly analyzing vast amounts of data. An intuitive example came in the form of a group exercise: group 1 developed an algorithm (five characteristics based on a set of 12 images of convicted criminals) to identify the most likely criminal in a crowd, group 2 – the computer – applied the algorithm, and group 3 – the humans – was tasked simply to identify the criminal without an algorithm. My colleagues in group 1 – which was made up of people from diverse backgrounds and ethnicities who live on the traditional territories of self-governing First Nations in the Yukon (and yes, that might have mattered to our decision-making) – opted to select criteria that did not include racial stereotypes. Needless to say, we broke the machine.

Each group struggled seriously with the ethical implications of its role. This was the point, of course: do the designers of algorithms simply reinforce the stereotypes of a highly biased judicial system that disproportionately affects Indigenous people and people of colour, and often men who are visibly part of these groups; or do they write an algorithm that does not fall into those stereotypes but focuses on other aspects?

Big Data Analysis

In my way of thinking, this kind of AI application lives in the realm of big data analysis. While I had imagined AI would feel unfamiliar and new, this felt extremely familiar: as a market researcher, I have followed work on “big data” analysis for years and watched how, with the aid of faster computers, our ability to analyze truly vast data sets has increased manyfold. The biggest advantage, indeed, is speed that cannot be matched by a single human brain.

The AI application this group exercise mirrored is based on the analysis of a vast amount of data – e.g. 10,000+ photographs of convicted criminals – using computer facial recognition. This analysis identifies statistical probabilities for the parameters that were set. Those probabilities are then used by humans to program an algorithm that seeks to identify people in large crowds who match the analysis. By definition, this kind of analysis looks to the past to inform the future; or in this case, to become the future.
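To make the mechanics concrete, here is a toy sketch in Python. All of the data and trait names are invented for illustration – nothing here comes from any real system – but it shows the pattern described above: derive the most frequent characteristics from past conviction records, then score people in a crowd against them. Any bias in who was convicted flows straight into who gets flagged.

```python
from collections import Counter

# Hypothetical training set: characteristics extracted from past
# convictions. Whatever bias shaped these records is baked in.
convicted = [
    {"beard", "tattoo", "cap"},
    {"tattoo", "cap", "glasses"},
    {"beard", "cap"},
    {"tattoo", "beard"},
]

# Step 1: statistical frequencies for each parameter, keeping the
# most common traits as the "algorithm".
counts = Counter(trait for person in convicted for trait in person)
top_traits = [trait for trait, _ in counts.most_common(5)]

# Step 2: the algorithm flags whoever matches the most frequent traits.
def suspicion_score(person_traits):
    return sum(1 for trait in top_traits if trait in person_traits)

crowd = [{"beard", "tattoo", "cap"}, {"glasses"}, {"scarf"}]
scores = [suspicion_score(person) for person in crowd]
# The person who most resembles past convictions scores highest,
# regardless of what they have actually done: the past becomes the future.
```

The point of the sketch is that nothing in it is "intelligent": it is a frequency count of historical records applied forward.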

Ethical Dilemma

The humans who build such an algorithm – which itself is devoid of AI self-learning, that is, the acquisition and application of new information and capacity – determine its outcome.

When these humans do not apply a greater understanding, or an ethical lens (related to systemic impacts of oppression of certain groups in society, for instance) to the parameters analyzed in the first place, or to the resulting statistical probabilities, they are bound to create algorithms that reinforce the systemic biases evident in society.

In short, they may miss a lot of criminals and identify a lot of non-criminals. In so doing, they may also ensure that more of the same groups of people are pursued with the government’s righteous rigour, resulting in higher incarceration rates for these groups. Rather than discovering what is real, such an algorithm perpetuates a seriously biased reality that would increasingly disadvantage specific groups. The past literally becomes the future.

AI governance as data governance

This discussion of what algorithms are today – centred on big data and what we can and should do with it – was fascinating. Alas, it didn’t paint a picture for me of artificial intelligence in the sci-fi sense.

In any case, as a result of this data focus, the AI governance discussion was heavy on data governance, i.e. the collection, storage and use of personal data. The Personal Information Protection and Electronic Documents Act and provincial laws already govern information that is identifiable to an individual. Canada’s Anti-Spam Legislation tries to combat spam and other electronic threats. There is a Do Not Call List to regulate how landlines can be used. These legislative tools tend to deal with one specific technology at a time, an approach that leaves much grey and blank space as companies explore and create more advanced technological innovations. Simply put, technology changes more rapidly than laws.

In the end I feel it is this conundrum that AI governance should address – to move away from regulating one specific technology at a time to contemplate the notion of privacy and social licence we wish to adopt in our society.

Definitions of Artificial Intelligence

Artificial intelligence (AI) is an area of computer science that emphasizes the creation of intelligent machines that work and react like humans.

Some of the activities computers with artificial intelligence are designed for include:

  • Speech recognition
  • Learning
  • Planning
  • Problem solving

[Source: Techopedia]

4 Types of AI

  1. Reactive machines – e.g. Deep Blue, the chess-playing machine
    • Reactive machines have no concept of the world and therefore cannot function beyond the simple tasks for which they are programmed.
  2. Limited memory – e.g. autonomous vehicles
    • Limited memory builds on observational data in conjunction with the pre-programmed data the machines already contain.
  3. Theory of mind – e.g. current voice assistants are an incomplete early version
    • Decision-making ability equal to the extent of a human mind, but by machines.
  4. Self-awareness – so far this only exists in the movies
    • Self-aware AI involves machines that have human-level consciousness.

Source: G2

Cross-posted on https://digitalartsnation.ca

Leadership matters: Reflecting on the Yukon Arts Summit

My mind keeps returning to the Yukon Arts Presenters Summit. I had the rare benefit of debriefing with Michele Emslie, Summit organizer and Community Programming Director at the Yukon Arts Centre, over a few days and assisting in reviewing the personal and group action plans to which participants committed.

I am struck by the leadership capabilities that underpinned the success of the summit; qualities that go beyond being adaptable or seeking to be relevant to stakeholders.

Design thinking applied

Rather than define and solve a specific problem, the organizers held themselves to a different standard based on a broad goal: strengthening the Yukon arts presenting eco-system. Making such a broad goal central meant that much effort was spent on creating the conditions in which participants could discover and define the actions that were important to them. At the heart of this design thinking approach lies the understanding that a combination of empathy, creativity, analysis and synthesis, as well as explicit spaces for convergent and divergent thinking, is essential. In short, by taking this approach, the organizers succeeded in creating a space in which a diverse group of participants could learn, reflect, be inspired, meet and talk together, and arrive in new places together.

Co-creating an intentional journey

There was no pre-defined destination, no agenda in terms of specific outcomes, no boxes to check off, no need for linear progression. Rather, there was an invitation to join together on a journey of discovering common ground and action priorities.

The organizers were focused on empowering participants from the start, knowing that the summit is its participants. They asked potential participants to co-create the content by soliciting feedback on hot topics and burning issues. 60 responses came in! Organizers listened carefully and found five key themes to address. An important effect of this open, listening approach was that the tone of the summit – its ownership – was already in the hands of participants well before they could even register for it.

Deep respect and trust in each person’s wisdom

The organizers showed a deep, easy respect for each person and their knowledge and experience. This was apparent in every facet, including activities like:

  • The Friday morning networking exercise using a photo, paper and markers to answer four questions: who are you/ what do you do, what is your hope for the future, what can you contribute to the summit, what do you need from it.
  • A gift exchange: each participant was asked to bring a gift that represented something about them, their work or their community (using their imagination rather than their pocketbook). These gifts were randomly distributed at lunch, and then everyone read the brief note attached by the giver and talked about what significance the gift held for them. There were all kinds of wondrous giver-receiver match-ups, and the exchange made for a profound sense of connection and some fun. 100+ people managed to share in plenary over lunch while staying on schedule for the entire conference.
  • Each day’s opening reflections, ranging from an elder’s prayer to Haiku to Gramma Susie.

Wisdom comes from many places and, in particular, the spaces in between.

Action-oriented

The summit schedule was action-packed, not because it featured talking heads or experts, but because of its focus on facilitation, conversation, meeting and thinking together, and action planning. (As a speaker, I felt I was well briefed heading into the summit!) As a result, this summit produced several big ideas and actions through collaboration, rather than consensus. Perhaps most important, ownership of these ideas now resides within the community itself, held by various champions and those who gathered around these big ideas. Conference organizers didn’t get a long task list back; rather, they received a strong mandate to remain stewards of the process, facilitate the next steps and continue leading by encouraging leadership from within the arts community.

Using Open Space methods, participants pitched various initiatives for discussion and to see which ones were strong enough to warrant concerted action.

A network = An action community

I believe we are seeing a profoundly different kind of arts presenters network emerge in Yukon. Not one that becomes a membership-based service model over time and that might suffer the eventual difficulties that have become so well documented for many membership-based associations; but a living, breathing, creative community that gathers around common actions (which require a just large enough group to be interested in working together), that is highly responsive to emerging and changing needs, and that delegates authority to all participants while benefiting from unhurried and effective stewardship provided by the Community Programming Director at the Yukon Arts Centre (YAC). Finally, YAC is ideally positioned for this role as it is a territorially created arts centre whose mandate includes strengthening arts as an important cultural, social and economic force in the Yukon Territory as a whole.

This close-knit, open network grounded in shared leadership and personal commitments, will show us how big ideas can be realized through concerted actions – unfettered from needing to establish narrow service priorities or delegating authority to a few (like a board of directors) – and thus able to grow and shift as the situation warrants.

What’s the matter with numbers?

With thanks to CAPACOA for commissioning my response to the Culture Shock debate entitled “Hard Facts VS. Proverbial Truths: The Impact of Arts & Culture on Canadian Citizens & Communities”, held on November 20, 2014 at the Community Knowledge Exchange Summit and moderated by Canada Council for the Arts CEO Simon Brault. You can watch the archived livestream here.

Billed as #CultureShock, Alain Dubuc, a journalist and economist, and Shawn van Sluys, who heads up a philanthropic foundation that works to make the arts more central to our lives, debated whether “For arts and culture to be fully valued by society, their impact must be demonstrated with hard facts” or whether proverbial truths are sufficient.

The case for telling the stories of transformation and understanding through art was made eloquently. Yet, I was more struck by the economist’s assertion that hard facts are “the best way” rather than “the only way” to ensure we fully value arts and culture.

This debate brought to my mind Daniel Kahneman’s observation in Thinking, Fast and Slow  that humans  have a propensity to believe that “what you see is all there is.”  He cautions us that we can easily miss important parts of a situation because there may be more going on than meets the eye.

And that reminded me of the old adage that what we count is what matters.  By inference that suggests that we actually count what truly matters, and that those things left uncounted do not matter.  In the arts much of what gets counted are ticket sales or attendance as a percentage of capacity. Until recently, little attention has been paid to collecting the stories, let alone data points, of impact and benefits of the arts. In my view, just because some things are (relatively) easy to measure, like attendance or GDP or employment figures, that does not mean that they tell the whole story – or the most important parts of the story. Conversely, just because some things are harder to measure that doesn’t necessarily make them any less important or, for that matter, immeasurable.

Indeed, I think we gain the deepest insights through a purposeful combination of numbers and stories. For numbers are not meaningful by themselves. Numbers require context and an understanding of the intrinsic dynamics at play. In my work as a researcher and strategist, my task is not merely to produce tables and analysis, but to interpret findings and create meaning. It is this highly creative process of meaning creation and collaboration with all the decision-makers that can lead to new insight. And in creating meaning we bring the numbers to life through examples: the stories.

Some in the arts do not wish to speak the language of numbers which they equate with the language of business. From my experience working with corporations I know that yes, numbers are important, but many invest heavily in innovation and creativity in order to solve significant problems and improve quality of life through new products and services. The divide is not so great. Rather, we may well be just lacking translators or mediators; people who are proficient in both languages and who can help us understand each other better.

Watch the debate. 

Research in the arts for everyone!

I was invited to present a webinar on the Dos and Don’ts of Research in the Arts this week by Ontario Presents and Atlantic Presenters Association. The full webinar recording and a few downloadable files are available here.

The audience for this session was people working in arts presenting organizations. They are typically not researchers, but they use or commission research and may well put together the occasional survey. The webinar focused on understanding what value research brings, reviewing a comprehensive research design, briefly exploring the major types of research (secondary, qualitative, quantitative and data analysis), a few high-level observations on sample design and questionnaire design and, finally, the legal frameworks that apply to marketing research.

Obviously, each of these topics merits much more time and depth. In fact, this webinar came out of a 1.5-day arts research seminar I conducted for Atlantic Presenters in St. John’s last June. All to say, with this webinar I aimed to raise awareness of what we think about when undertaking and designing research, and to keep it real in terms of practical applications. The hands-on workshop is a whole different level of learning and practicing research and analysis skills.


Strategies to grow membership

At a recent board workshop we discussed different ways to look at the association’s membership in order to understand better how to grow it.

I proposed to look at the cumulative number of members over several years for a more complete evaluation. Typically, we look at the total number of members – or subscribers – as an annual figure and then we pay some attention to churn (non-renewing members). Growth occurs when this churn figure is lower than the number of new members acquired, i.e. more people join than drop out. Evaluating churn makes clear why the first task in an established organization is usually retention, keeping members/subscribers year after year. High rates of retention mean that growth can be achieved more readily (as long as you have not captured your entire market);  it also means that your marketing efforts should become more cost-effective as retention should cost less than acquisition.

When we look at a wider time span, for instance 5 or 10 years, we gain a different understanding of the degree to which an organization has reached and engaged its market. Is the cumulative 5-year figure very close to the annual figure, or is it much larger?

If it is very close then you are basically stable. If you wish to grow in this scenario then you need to focus on acquisition strategies to accelerate growth.

If the 5-year cumulative figure is much larger, then you might need to think not only about acquisition but re-acquisition. Re-acquisition means re-engaging with people who have already made up their minds about the value you provide and rejected it for some reason. Re-acquisition is quite a different task, requiring different strategies, tactics, messages and channels. Because these people are not a blank slate – they have developed firm beliefs about your organization and perceptions founded in their personal experience – I think that re-acquisition is fundamentally more difficult than gaining a brand-new member, subscriber or customer.
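The two views described above are easy to compute if you track members by some identifier. A minimal sketch in Python (the member IDs and years are invented for illustration) comparing the annual member count with the cumulative count of unique members, plus year-over-year retention and churn:

```python
# Hypothetical membership rosters by year, keyed on member ID.
members_by_year = {
    2018: {"ann", "ben", "cam", "dee"},
    2019: {"ann", "ben", "cam", "eva"},
    2020: {"ann", "ben", "fay", "gus"},
    2021: {"ann", "ben", "cam", "fay"},
    2022: {"ann", "ben", "fay", "hal"},
}

years = sorted(members_by_year)
annual_2022 = len(members_by_year[2022])                  # this year's count
cumulative = len(set().union(*members_by_year.values()))  # unique members ever

# Year-over-year: growth occurs when acquired members outnumber churned ones.
for prev, curr in zip(years, years[1:]):
    retained = members_by_year[prev] & members_by_year[curr]
    churned = members_by_year[prev] - members_by_year[curr]
    acquired = members_by_year[curr] - members_by_year[prev]
    print(curr, "retained:", len(retained),
          "churned:", len(churned), "new:", len(acquired))

# A cumulative figure (8) much larger than the annual one (4) signals a
# pool of lapsed members, i.e. a potential re-acquisition market.
print("annual:", annual_2022, "cumulative:", cumulative)
```

In this toy data the organization looks stable year to year, yet twice as many people have been members at some point as are members now, which is exactly the scenario where re-acquisition becomes relevant.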

Strategically, this dynamic has to be considered in light of your total market potential.

There are times when re-acquisition can be critical to ensure an organization’s sustainability in the long-run. Given the nature of re-acquisition, strategies designed to re-engage likely run their course over 3 to 4 years. The focus then has to shift to true acquisition because those you wish to re-engage either have done so or simply are not going to have their minds changed unless something important, and likely out of your control, changes for them.

In both scenarios, retention driven by creating value and a mutually beneficial and meaningful relationship with members remains paramount.

The trouble with dynamic strategy canvases: Calgary Centre by-election

Calgary Centre is holding a by-election on November 26, 2012. A group of progressive citizens under the banner of 1CalgaryCentre (website and Facebook) has attempted what progressive political parties have been unable to do: create a process to select a consensus progressive candidate. It’s a grand experiment; it is idealistic and optimistic. It suggests doing something is better than doing nothing.

That process returned valuable insights, but that’s not really what this post is about. (Yes, Chris Turner, running for the Green party, won – his campaign was far more effective at turning out the vote online.)

Some people have been critical of the lack of “representativeness” of the 1CalgaryCentre process. Well, it wasn’t intended to be representative – or predict an election outcome based on a ‘snapshot-in-time’ poll. Rather it sought to crowdsource a consensus candidate among engaged progressive voters, i.e. people likely to actually cast a vote. That is a fundamentally different objective than the typical election polls reported on in the media as if they could predict the outcome on election day (Yes, horse races can be fun to watch. Yes, voters benefit from having access to non-partisan polling as it shows them a snapshot of what people in their riding are thinking).

1CalgaryCentre offers another, unique data point that progressives in Calgary Centre can consider when they go to vote.

A worthwhile point about random election polls: yes, they are based on the science of statistics, and yet they consistently and dramatically over-estimate voter turnout. So the real question isn’t whether random polls are accurate – they usually are, within the errors they measure at that moment in time. Sampling error is captured by the “margin of error”; all other errors are usually ignored in reporting, such as people inaccurately reporting whether they will actually vote. It’s human nature: voting is seen as socially desirable, so when Canadians are asked, they are prone to say they will vote.

Actual voter participation comes in at about 60% on average in recent Canadian general elections; it was 55.6% in Calgary Centre in the 2011 federal election. The random polls published over-estimate turnout significantly. Return-on-Insight reported that 12% said they won’t vote (thanks for the honesty!) and 16% were undecided. Similarly, the last Forum poll, on November 17, surveyed 403 Calgary Centre residents, of whom 374 indicated who they would vote for – leaving only about 8% indicating they aren’t voting. Matching the actual turnout in Calgary Centre in 2011, the true number of “won’t vote” should be closer to 45%.
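As an aside, the sampling error mentioned above is simple to compute. A quick sketch using the standard 95%-confidence formula (not taken from any of the cited polls) for a sample of n = 403, at the worst-case proportion p = 0.5:

```python
import math

n = 403   # sample size of the November 17 Forum poll
p = 0.5   # worst-case proportion, which maximizes the error

# 95% confidence: z = 1.96
moe = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"margin of error: ±{moe * 100:.1f} percentage points")
```

Note that this figure covers sampling error only; over-reporting of the intent to vote is a non-sampling error and does not shrink with sample size.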

In essence this means that the strategy canvas on which this by-election is happening is highly dynamic: voter turnout alone will determine the result. The reason is that the random phone surveys suggest that two-thirds of Calgary Centre voters support progressives. It is actually mathematically possible for progressives to come in 1st and 2nd.

So here’s some math for you:
Scenario 1 – 2011 turnout: 49,235 voters

  Party     Forum poll (Nov 17)   2012 by-election votes   Differential to leader
  CPC       35%                   17,232
  Liberal   30%                   14,771                   -2,462
  Green     25%                   12,309                   -4,924
  NDP       10%                    4,924

Scenario 2 – 2011 turnout minus 10%: 44,312 voters

  Party     Forum poll (Nov 17)   2012 by-election votes   Differential to leader
  CPC       35%                   15,509
  Liberal   30%                   13,293                   -2,216
  Green     25%                   11,078                   -4,431
  NDP       10%                    4,431

Scenario 3 – 2011 turnout minus 20%: 39,388 voters

  Party     Forum poll (Nov 17)   2012 by-election votes   Differential to leader
  CPC       35%                   13,786
  Liberal   30%                   11,816                   -1,969
  Green     25%                    9,847                   -3,939
  NDP       10%                    3,939

You see how the lower the turnout, the more important the progressive vote becomes, i.e. the poll-leading party’s victory becomes ever thinner in terms of the number of votes separating it from the runners-up.
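For readers who want to check the arithmetic, here is a short sketch that recomputes the three scenarios from the same two inputs: the Forum poll shares and the 49,235 votes cast in 2011. It uses half-up rounding to match the tables; any one-vote differences in the differential columns come from rounding order.

```python
def rnd(x):
    # half-up rounding; Python's built-in round() rounds halves to even
    return int(x + 0.5)

base_2011 = 49235  # votes cast in Calgary Centre in the 2011 general election
shares = {"CPC": 0.35, "Liberal": 0.30, "Green": 0.25, "NDP": 0.10}

scenarios = {}
for turnout in (1.0, 0.9, 0.8):
    voters = base_2011 * turnout
    votes = {party: rnd(voters * share) for party, share in shares.items()}
    leader = max(votes.values())
    behind = {party: v - leader for party, v in votes.items() if v != leader}
    scenarios[turnout] = (rnd(voters), votes, behind)

# The lower the turnout, the thinner the leader's margin in raw votes.
for turnout, (voters, votes, behind) in sorted(scenarios.items(), reverse=True):
    print(voters, votes, behind)
```

Running this reproduces the vote counts in the tables and makes the dynamic explicit: shrinking the turnout shrinks every party’s total, but it shrinks the leader’s absolute margin fastest.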

In essence, the strategy canvas is dynamic and not set (this is not a zero-sum game), so here is another calculation:
Imagine that the Green party supporters identified by the Forum polls turn out to vote at 75%, while the other parties deliver on the high end of a by-election turnout:
  Candidate / party                               Forum poll   Eligible voters   2012 votes
  Chris Turner (Green) – 75% of his 25% support   25%          88,520            16,597
  CPC (votes from scenario 2)                                                    15,509
  Liberal (votes from scenario 2)                                                13,293
  NDP (votes from scenario 2)                                                     4,431
  New base                                                                       55,363

That’s right. This would boost the overall voter participation rate quite a bit. Note that this calculation uses two different base numbers. And why not? This is a dynamic field, where variables move based on actual behaviour. It’s within the Turner team’s grasp to win. Needless to say, it’s also within the Liberal party’s grasp to win. Deeper analysis by others suggests that the Liberal campaign team is less likely to boost its support. Still, if both Turner and Locke were to mobilize their vote at high enough rates, Calgary Centre could return the most unexpected result: a 1st and 2nd place for progressive candidates.
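The differential-turnout calculation above, expressed as code for anyone who wants to vary the assumptions (Turner at 75% of his 25% support among the 88,520 eligible voters, the other parties held at their scenario 2 vote counts):

```python
eligible = 88520                       # eligible voters in Calgary Centre
turner = int(eligible * 0.25 * 0.75)   # 75% of his 25% support: 16,597
others = {"CPC": 15509, "Liberal": 13293, "NDP": 4431}  # scenario 2 votes

total_votes = turner + sum(others.values())
participation = total_votes / eligible

# Turner edges out the CPC in this scenario.
print(turner, total_votes, f"{participation:.1%}")
```

Summed this way, the four counts come to 49,830 ballots, or roughly 56% of eligible voters – far above a typical by-election turnout, which is the point: differential mobilization changes the canvas itself.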

Confidence is a key to the scenario. People do like to vote for winners. That means, Turner voters have to feel confident they can win and then they might vote for him. That’s the whole secret to momentum and translating momentum into votes.

Those voting for Chris Turner must turn out at high rates; if they do, they can secure a seat in Ottawa. Amazingly, it isn’t just people under 40 who would make the difference, but people of all ages. Random polls report strong support for the CPC among 18-34-year-olds, which is plausible enough.

If we accept that Conservative voters are split by their particular candidate’s leaning toward the far right – leaving the so-called Red Tories out, so that they either will not vote or will vote Green (Jim Harris, a Red Tory, went on to lead the Green Party) – then Canada has the makings of a truly historic moment today, driven by citizens who are both politically engaged and fed up with being discounted.

Pathways to House of Commons for progressives

Calgary Centre progressives have a beautiful process here (despite its many limitations, most notably the lack of cooperation from the progressive parties) that says Turner has the greatest momentum and looks best able to turn out the vote. The math says: show up and vote, and a progressive will win.

If you were part of the 45% who didn’t vote in 2011 – that’s 39,000 people – realize it’s up to you. The relatively small but critical number of NDP supporters also have it in their hands to elect a progressive by voting for Turner or Locke. The Greens can do it by themselves if they turn out in unprecedented numbers for Chris Turner. These are the three main pathways on a fluid strategy canvas. As you can see, turnout, which truly is a measure of enthusiasm, is everything.

With this I await the returns from tomorrow’s vote to find out whether Calgary Centre isn’t just progressive in spirit, but filled with enough progressive voters to send a progressive to Ottawa to represent them.

And I applaud the 1CalgaryCentre team on persevering throughout this first-of-its-kind citizen experiment. It has made this by-election worth watching from afar.

(Full disclosure: as a tax-paying, long-time permanent resident in Canada, I cannot and do not vote in any Canadian elections. I grew up in West Germany, a democratic country with a proportional and representative voting system – one vote for a candidate in the riding, one vote for a party, which are combined to form German parliaments. Half the parliamentarians are elected directly, the other half via the party lists, which usually take geography into account. It’s a purposefully designed multi-party system that more often than not results in coalition governments, with coalition contracts forged during negotiations following elections.)

Making information accessible

Long, comprehensive reports, however enthralling, are, at best, read by a minority. That’s why I have been preparing smaller, focused supplementary reports for the Value of Presenting study.

A report on Francophone minorities in Canada: La diffusion des arts vivants dans la francophonie canadienne (PDF)

Special Report Rural Northern Presenting (PDF)

And the presentation in Powerpoint format I used for a couple of webinars to discuss this information with rural and Northern presenters earlier this month:
Presentation Rural Northern Presenting Highlights (PDF)

A special report on Dance Attendance Supplementary Analysis (PDF)

These reports go beyond the Interim Report that spawned them by deepening the specific information from our survey of the Canadian public.

Additional segment reports will be published over the spring to help Canadian presenters and anyone interested see themselves more clearly in this sector-wide study.

Media coverage in articles and interviews is summarized in this post.