Global insights consultancy Kantar locates its first advanced analytics hub in the region in Singapore


WPP-OWNED global insights consultancy, Kantar, is partnering with the Singapore Economic Development Board (EDB) to launch its first research and development hub in Asia.

The Brand Growth Lab will focus on advanced analytics and use Big Data, artificial intelligence and machine learning to help companies grow their brands. It will have a strong innovation mandate and aims to transform unstructured data into insights that drive customer-centric decision-making and sustainable growth for companies.

In addition to the lab’s innovation mandate, the three-year collaboration between Kantar and the EDB also includes the hiring of data scientists and business designers, thus developing a strong pipeline of Singapore-based talent and expertise in this area.

The creation of the lab follows the establishment of similar analytics labs in London and Frankfurt, and this year’s launch of the Professional Services Industry Transformation Map (ITM), a roadmap that seeks to develop Singapore into a global leader in the professional services industry.

Said Hernan Sanchez, Kantar Brand Growth Lab’s managing director: “Brands no longer have to rely on hunches, but can instead substantiate their decisions based on intelligent analytics.”

Kelvin Wong, EDB’s assistant managing director, said: “We are delighted that Kantar has chosen Singapore to locate its first advanced analytics hub in Asia. Singapore’s professional services sector is growing, and Kantar’s decision is testament to this.”

[“Source-businesstimes”]

Continuous Testing Insights from 2018 DevTest Research


The year is far from over, but there are already several interesting DevTest surveys worth your attention. These studies don’t just quantify the obvious; they actually report some unexpected findings about how far and how fast we’re advancing, and offer some very specific advice on what’s needed to improve.

We strongly recommend that you spend some time reading all three of these surveys in their entirety. However, in case you’re short on time (or impatient … or both), we wanted to highlight the findings that are most pertinent for readers practicing or researching Continuous Testing.

 

Sauce Labs – Testing Trends for 2018: A Survey of Development and Testing Professionals

 

[Read the complete report]

 

2018 marks the fourth annual “Testing Trends” report, which is based on a global survey of more than 1,000 technology professionals responsible for developing and testing web and mobile applications.

 

Key findings in terms of testing include:

 

  • 87 percent report that management supports test automation initiatives.
  • 45 percent expect to increase spending on test automation in 2018 (55 percent at large companies).
  • The number of respondents with high levels of test automation dropped to 28 percent in 2018 from 32 percent in 2017.
  • The release cadence is actually slowing, with hourly deployments dropping to 5 percent from 14 percent and daily deployments dropping to 27 percent from 34 percent.

 

In other words, everyone recognizes the value of test automation and most companies are willing to invest in it. However, test automation rates are actually decreasing, even as Agile and DevOps adoption steadily increase. The 2017 report found that test automation rates increased slightly and that delivery speed also increased slightly. The 2018 report showed the same correlation in the opposite direction: test automation rates decreased, and the release cadence slowed down.

 

GitLab – 2018 Global Developer Report

 

[Read the complete report]

 

This expansive survey polled 5,296 software professionals from around the world. The majority of respondents were software developers or engineers working for small- to medium-sized businesses (SMBs) in the hardware, services and SaaS industries.

 

Testing wasn’t a common topic in this development-focused research, but it did earn a prominent spot in the report. Testing was the No. 1 response to the question, “Where in the development process do you encounter the most delays?” A dubious honor—but not a surprising one. Last year’s DevOps Review polled an entirely different audience and came up with the exact same finding.

 

VersionOne – 12th Annual State of Agile Report

 

[Read the complete report]

 

The 12th edition of the world’s longest-running Agile study found that while 97 percent of the 1,492 respondents’ organizations are practicing Agile, 84 percent report that their Agile adoption is not yet mature.

 

Respondents feel strongly that two testing-related items would help them increase process maturity across both Agile and DevOps:

 

  • 83 percent want end-to-end traceability from business initiative through development, test and deployment.
  • 82 percent want better identification and measurement of risk prior to deployment.

 

Respondents also reported a relatively high level of adoption of development testing and “shift left” testing techniques. Adoption levels were reported at:

 

  • Unit testing – 75 percent.
  • Coding standards – 64 percent.
  • Pair programming – 36 percent.
  • TDD – 35 percent.
  • BDD – 17 percent.

 

Testers might also be interested in the survey’s feedback on Agile management tools. Usage rates were reported at:

 

  • Atlassian Jira – 58 percent.
  • VersionOne – 20 percent.
  • Microsoft TFS – 21 percent.
  • HP (now Micro Focus) Quality Center / ALM – 14 percent.

 

The most highly recommended tools were VersionOne, Jira and CA Agile Central. HP Agile Manager, Hansoft and HP Quality Center / ALM were the least likely to be recommended.

[“Source-devops”]

Helping close divisions in the US: Insights from the American Well-Being Project


Editor’s Note: The American Well-Being Project is a joint initiative between scholars at the Brookings Institution and Washington University in St. Louis.

Issues of despair in the United States are diverse, widespread, and politically fueled, ranging from concentrated poverty and crime in cities to the opioid crisis plaguing poor rural towns. Local leaders and actors in disconnected communities need public policy resources and inputs beyond what has traditionally been available.

Scholars at Brookings and Washington University in St. Louis are working together to analyze the issues underlying America’s disaffection and divisions in order to provide policy ideas for a better, more inclusive future. Through on-the-ground community research in Missouri—a microcosm of America’s problems—as well as the application of ongoing policy research, we hope to develop approaches that can tackle factors like lack of access to health care, scarcity of low-skilled jobs, weak education systems, and hollowed-out communities.

Simply put, we are asking: how has the American Dream been broken, and how can it be restored?

WHAT WE KNOW AND WHAT IS MISSING

In general, indicators such as economic growth and unemployment rates continue to improve in the U.S., as do some markers of well-being, such as longevity. Yet the aggregate indicators mask inequality of access and outcomes. Such indicators do not account, for example, for the decline in prime-age male labor force participation, nor do they reflect the rising numbers of “deaths of despair” due to opioid or other drug overdoses, suicide, and other preventable causes. Such deaths are concentrated among middle-aged whites without a college education.

The past few decades have also seen a dramatic increase in the disability rate (the number of disabled Social Security beneficiaries), greater income inequality, and stagnating mobility rates. Different regions have had divergent fortunes, meanwhile, and many, particularly in the heartland where manufacturing has declined, are characterized by “left-behind” populations in poor health, with little hope for the future, and a hollowed-out middle class.

As such, the macro numbers simply do not capture the full picture of inequality, public frustration, and socioeconomic distress. Well-being metrics could be part of the solution in understanding trends among and across subpopulations.

Looking back on recent episodes of political upheaval, we can see that previous decades produced clear indicators that should have been read as red flags for the current crisis. If we can better identify these risk factors in advance, then we can provide appropriate policy recommendations to those working in the communities most affected, as well as anticipate the challenges of the populations and places at greatest risk.

HOW CAN RESEARCH AND DATA BE USED AT THE LOCAL LEVEL? THE APPLICATION OF SUBJECTIVE MEASURES

As we further explore metrics of well-being, the question will be how to analyze data in a way that is usable and valuable to local leaders. While well-being measures offer interesting insights, they are inherently subjective and focused on mindset rather than quantitative outcomes. Pairing well-being measures with traditional “hard” measures like GDP and employment rates has proven useful in the past.

As shown by research in Peru into the relationship of traditional economic and social measures to perceived well-being, status, identity, and inclusion, hope is a significant factor in determining success. People who are more hopeful tend to have better economic and social outcomes.

Communities should also strive to achieve a balance between hope and realism. Although our research shows that hope is a key determinant of well-being, excessive optimism can easily lead to disappointment.

Personal responsibility for success is also an important factor. To the extent that people blame themselves (or their neighbors) for the current social and economic challenges, pressure for policy responses is lost. Too much blame on individual agency makes a community unwilling to try to make things better through policy. The goal should be to achieve a healthy balance of outlooks, personal responsibility, and realistic understanding of chances for success.

Better indicators of people’s outlooks on life combined with indicators of opportunity and deprivation could help achieve this at the grassroots level. Novel approaches that combine quantitative and qualitative data can inform a range of community efforts. Scholars at Washington University have already taken the lead by using national data from call-in distress services for individuals and families, with the goal of identifying specific geographic information, down to the neighborhood level, on vulnerable areas.

Brookings scholars have worked with the state of Colorado to implement a comprehensive system for monitoring mobility and opportunity (the Colorado Opportunity Project) and, in a separate effort, with the city of Santa Monica to design a program that regularly monitors a range of well-being dimensions.

NEXT STEPS

Now is an opportune moment for local, regional, and state leaders to make positive changes in communities, rather than waiting for action at the federal level. And, given the complex nature of our crisis of divide and desperation, policies must be better targeted to different age, racial, and socioeconomic groups and their circumstances, something best achieved at the local level.

Even if analyses and practices are adapted for specific geographic regions and demographic groups, local governance challenges will still make implementation difficult to achieve on the ground. Many communities lack local leadership and empowered community organizations. Nongovernmental organizations, state level governments, and even the private sector can help fill the leadership void in communities and support existing local efforts.

The fact is that the issues of despair in America have no one answer, nor does the responsibility fall on a single sector, institution, or group of people. It will take a concerted effort from many stakeholders, focusing on an immense set of challenges that differ from community to community.

Our collaboration between Brookings and Washington University aims to help those taking the lead by providing valuable data, analyses, and policy ideas.

[“Source-brookings”]

Supercharging your SEO with AI: Insights, automation and personalization

Recently, I had the pleasure of presenting at SMX London on Supercharging your SEO with AI and thought I would share some of the insights with Search Engine Land readers.

Google made global headlines with the demonstration of its new Duplex at this year’s I/O developers conference. This artificial intelligence (AI) system can “converse” in natural language with people to schedule an appointment at a hair salon or book a table at a restaurant, for example.

To pass the Turing Test, AI must behave in a manner indistinguishable from that of a human. To many, Google Duplex has proven that it can pass this test, but in truth, we are only seeing the beginnings of its future potential.


This particular use of AI made headlines because people are drawn to applications of AI that can mimic human interactions, whether in science fiction or in real life. While that response is driven by fascination, it is also host to an element of fear.

Can AI replace people?

As marketers, we typically encounter two perspectives on this. Either AI will take our jobs and render us obsolete or it will complement our skills and make us more effective.

According to a study by The Economist, 75 percent of executives say AI will be “actively implemented” in companies within the next three years, so this is more than a hypothetical discussion.

As hype turns to reality, we are realizing that the second perspective is the likely outcome. This would certainly be the most beneficial outcome, with PricewaterhouseCoopers predicting that AI will add $15.7 trillion to global GDP annually by 2030.

Moreover, AI is already all around us, embedded in products and services we use every day, like Netflix and Pandora.

Perhaps most pertinently to us as marketers, AI is deeply embedded in search, and it opens a raft of new opportunities for SEOs that embrace this technology early.

The role of AI in search

Artificial intelligence is making search more human. Although search does not yet “speak” to users in the same way the Google Duplex demo could, its objective is very similar.

Google’s RankBrain technology uses machine learning to understand the meaning of the content it crawls; it infers intent from ambiguous search queries; and it uses feedback data to improve the accuracy of its results.

In other words, it listens and it learns.
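
As a concrete illustration of what “inferring intent” from a query can look like in practice, here is a minimal sketch of a query-intent classifier. This is not RankBrain (Google has not published those internals); the intent categories, training queries, and scikit-learn pipeline are assumptions chosen purely for illustration.

```python
# A toy query-intent classifier: TF-IDF features plus logistic regression.
# The training queries and intent labels below are hypothetical examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

queries = [
    "buy running shoes online",   # transactional
    "cheap flights to london",    # transactional
    "how to tie running shoes",   # informational
    "what is duplex ai",          # informational
    "nike official site",         # navigational
    "facebook login",             # navigational
]
intents = ["transactional", "transactional",
           "informational", "informational",
           "navigational", "navigational"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(queries, intents)

# Classify an unseen, ambiguous query and inspect the confidence per intent.
print(model.predict(["running shoes"]))
print(model.predict_proba(["running shoes"]))
```

With only a handful of training examples the output is obviously illustrative, but the pattern (text features, a classifier, a probability per intent, feedback data to retrain on) is the general shape of intent inference at any scale.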

Though we may not always have visibility over these processes, we do see the outputs very clearly. Research by BrightEdge (my company) into a dataset of over 50 million keywords revealed that 84.4 percent of queries return universal search results. This occurs as Google uses AI to match the layout of search results pages to the user’s intent.

There are now 37 different search engine result page (SERP) categories, a number that will only increase over the coming months and years.

The potential for personalization has not yet been truly tapped, but Google CEO Sundar Pichai has made public the company’s goal to be an “AI-first” company. As such, we should all expect the search landscape to change dramatically as AI takes center stage, in the way it already has in products like Google Photos and Google Lens.

As co-founder Sergey Brin put it:

“AI touches every single one of our main projects, ranging from search to photos to ads.”

The pace of development on this front is accelerating, as Google is all too aware that AI can simply deliver better, more personalized experiences for consumers. However, search marketers need to pay close attention to these technological advancements if they are to avail themselves of these opportunities for SEO.

How to supercharge SEO with AI

There are three key areas in which AI can improve SEO performance:

  • Insights.
  • Automation.
  • Personalization.

Insights

Artificial intelligence can process and interpret patterns in data at a scale people could simply never replicate. This makes it an essential complement to any search strategist, as AI can deliver the information we need to make informed decisions out of noisy, unstructured data.

Some common tasks where AI can aid SEO performance include the following (a clustering sketch follows this list):

  • Market trends analysis.
  • Site performance analysis.
  • Competitor insights.
  • Customer intent reports.
  • SERP performance.
  • SEO and pay-per-click spend management.
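
As a sketch of the first few items on this list, the snippet below groups a small keyword set into rough topical clusters, the kind of grouping that feeds market-trend and intent reports. The keyword list, the cluster count, and the TF-IDF/k-means choice are illustrative assumptions rather than a prescribed workflow.

```python
# Group keywords into rough topic clusters with TF-IDF + k-means.
# The keyword list and the number of clusters are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

keywords = [
    "best trail running shoes", "trail running shoes review",
    "marathon training plan", "beginner marathon training",
    "running shoe store near me", "buy running shoes online",
]

vectors = TfidfVectorizer().fit_transform(keywords)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(vectors)

# Print each cluster with its keywords so an analyst can label the themes.
for cluster_id in range(3):
    members = [kw for kw, c in zip(keywords, kmeans.labels_) if c == cluster_id]
    print(cluster_id, members)
```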

In each of these scenarios, AI can surface new insights that may otherwise have gone unnoticed. As search moves beyond the traditional SERP and becomes a multidisciplinary channel, this will be increasingly important. New developments like visual search are bringing to light the central role of AI in processing new types of media, too.

Pinterest uses deep learning to interpret the content and context of images, opening up new opportunities for retailers to capitalize on “discovery search.”


Google Lens plans to use augmented reality to blend the physical and virtual worlds, using objects as queries rather than typed keywords.

Of course, these developments will lead to the creation of invaluable data, with each interaction revealing something new about our audience. As marketers, we should employ AI to ensure that we capture, process and use this data correctly to shape our search strategies.

How can you use AI for SEO insights?

  • Understand underlying need in a customer journey.
  • Identify content opportunities.
  • Define opportunity space in the competitive context.
  • Map intent to content.
  • Use structured data and markup (a JSON-LD sketch follows this list).
  • Invest in more long-tail content.
  • Ensure content can be crawled and surfaced easily by all user-agents.
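
Structured data is the item on this list that translates most directly into code. Below is a minimal sketch that builds schema.org Article markup as JSON-LD using only Python's standard library; every value is a placeholder, and the emitted tag would normally sit in the page's <head>.

```python
# Build schema.org Article markup as JSON-LD (all values are placeholders).
import json

article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Supercharging your SEO with AI",        # placeholder
    "author": {"@type": "Person", "name": "Jane Doe"},    # placeholder
    "datePublished": "2018-06-01",                         # placeholder
    "image": "https://example.com/cover.jpg",              # placeholder
}

# Emit the script tag that would be embedded in the page.
print('<script type="application/ld+json">')
print(json.dumps(article_markup, indent=2))
print("</script>")
```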

Automation

SEO is a labor-intensive industry that requires a huge amount of attention over the long term. Where we can automate tasks to receive the same output we could produce ourselves, we should make this a top priority. The time saved through automation can be applied to the areas that require our skills, like strategy and creative content.

The chart below shows the average amount of time spent on the essential but at times repetitive task of keyword research based on the size of the site in question.

Here are some of the tasks that are ripe for automation in SEO:

  • Technical audits.
  • Keyword research.
  • Content optimization.
  • Content distribution.
  • Tag management.
  • Internal linking.

In these instances, computers do replace people, but we are in control of what they do, and it is a logical decision to hand over such tasks to artificial intelligence. In the process, we can free up valuable time to take on the more challenging aspects of SEO strategy.
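
As a concrete example of handing such a task to a script, here is a minimal sketch of an automated on-page audit that flags non-200 responses, missing titles, and missing meta descriptions. It assumes the requests and beautifulsoup4 packages and a hypothetical URL list; a production crawler would add rate limiting, robots.txt handling, and proper reporting.

```python
# Minimal on-page audit: flag bad status codes and missing title/description.
# Assumes `pip install requests beautifulsoup4`; the URL list is hypothetical.
import requests
from bs4 import BeautifulSoup

urls = ["https://example.com/", "https://example.com/blog/"]

for url in urls:
    issues = []
    response = requests.get(url, timeout=10)
    if response.status_code != 200:
        issues.append(f"status {response.status_code}")
    soup = BeautifulSoup(response.text, "html.parser")
    if not soup.title or not soup.title.get_text(strip=True):
        issues.append("missing <title>")
    if not soup.find("meta", attrs={"name": "description"}):
        issues.append("missing meta description")
    print(url, issues or "OK")
```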

Some tips to get started with AI for SEO automation:

  • Break down tasks into sub-tasks, then score their potential for automation from 0-10.
  • Use rule-based automation to handle simple but time-intensive jobs.
  • Find the right balance between human labor and automation.
  • Feed ML algorithms the right quality and quantity of data.
  • Focus on user experience and speed monitoring and alerts (a minimal monitoring sketch follows this list); engagement rates will only increase in importance.
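
Speed monitoring with alerts, the last tip above, is another natural candidate for simple rule-based automation. The sketch below times a set of pages and prints an alert when the response time crosses a threshold; the URLs and the one-second budget are assumptions, and a real setup would run on a schedule and send alerts to email or chat rather than stdout.

```python
# Alert when a page's response time exceeds a budget (illustrative values).
import time
import requests

PAGES = ["https://example.com/", "https://example.com/pricing/"]
THRESHOLD_SECONDS = 1.0  # assumed budget; tune per site

for url in PAGES:
    start = time.monotonic()
    requests.get(url, timeout=10)
    elapsed = time.monotonic() - start
    if elapsed > THRESHOLD_SECONDS:
        print(f"ALERT: {url} took {elapsed:.2f}s (budget {THRESHOLD_SECONDS:.1f}s)")
    else:
        print(f"OK: {url} took {elapsed:.2f}s")
```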

Personalization

Personalization allows marketers to create relevant, useful experiences for each individual customer. Achieving this at scale requires technological assistance, with AI an integral part of this process.

Amazon has long been regarded as the market leader in personalization, as it takes user data to suggest new products based on their historical activity. This allows Amazon to surface products that do not typically receive much visibility, based on their relevance to each individual consumer.


Search marketers can take a number of lessons from this approach.

By mapping content to different states of intent, we can capitalize on these opportunities to cross-sell additional products.

This starts to move beyond traditional SEO and into the realm of vertical search optimization. We can see this trend in Google’s recent announcements, namely the integration of Assistant into Google Maps and the upgraded Google News app.


Content discovery is no longer limited to the search results page, so marketers must truly understand their consumers to ensure they can engage with them, anywhere and at any time.

Artificial intelligence is of vital importance at every stage of this journey. The field of predictive analytics, which makes predictions based on patterns in historical data, can help marketers to plan their content to meet consumer demand states.
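
To make “predictions based on patterns in historical data” concrete, here is a minimal sketch that fits a linear trend to twelve months of invented search-demand figures and projects the next three months. Real content planning would use richer models and account for seasonality, but the structure is the same: historical data in, a forward-looking estimate out.

```python
# Fit a linear trend to monthly search demand and project the next quarter.
# The demand figures are invented for illustration.
import numpy as np

monthly_demand = np.array([120, 135, 150, 160, 155, 170,
                           180, 175, 190, 200, 210, 220], dtype=float)
months = np.arange(len(monthly_demand))

# Least-squares line: demand ≈ slope * month + intercept.
slope, intercept = np.polyfit(months, monthly_demand, deg=1)

future_months = np.arange(len(monthly_demand), len(monthly_demand) + 3)
forecast = slope * future_months + intercept
print([round(value) for value in forecast])  # projected demand, next 3 months
```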

How can you use AI for SEO personalization?

  • Create content by persona, customer journey stage and delivery mechanism.
  • Enhance user experience and conversion through personalization.
  • Use semantically specific pages to associate query and intent.
  • Use personalization and audience lists to nurture leads across search and social.
  • Use AI to help publish content at the right times on the right networks.

Conclusion

The artificial intelligence revolution is already upon us, and sophisticated marketers are taking advantage!

Most AI systems are invisible, but that does not lessen the significance of their inner workings. The search landscape is in constant flux, and consumers are creating vast amounts of data, all of which can be turned into insights.

Automation can help us make sense of these insights and free enough time for marketers to develop innovative, personalized strategies. There can be little doubt that the tech giants have gone AI-first. Marketers who follow suit can supercharge their SEO strategies by using AI in three core areas: insights, automation and personalization.

[“Source-searchengineland”]