Tag Archives: data

NCDM Notes: Pragmatic Analytics

Presented by Portrait and AAA South

Overwhelming amounts of data combined with rapid growth (especially in unstructured data) make our job difficult: difficult to gather insight and difficult to implement changes once we have insight. We also need to make sure we are solving the right problem, not wasting time on results that are not actionable.

Analysts need to minimize the time required to understand and prepare the data: tasks that are necessary before any analytics or modeling can start.

There are a couple of solutions:

First, create analytical tables that summarize and aggregate data that is now manually pre-processed. AAA builds a big table that aggregates data at the household level. It includes activity summaries (both transactional and promotional), preferences, and demographics. For AAA this comes to about 150 columns, with some columns updated weekly and some monthly.
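The aggregation step can be sketched in a few lines. This is a hypothetical pandas illustration (the table and column names are made up), not AAA's actual SAS process:

```python
import pandas as pd

# Hypothetical transaction-level data; AAA's real table has ~150 columns.
transactions = pd.DataFrame({
    "household_id": [1, 1, 2, 2, 2],
    "amount":       [20.0, 35.0, 10.0, 5.0, 40.0],
    "channel":      ["web", "mail", "web", "web", "phone"],
})

# Aggregate to one row per household: the kind of activity summary an
# analyst would otherwise have to pre-process manually for every project.
household = transactions.groupby("household_id").agg(
    txn_count=("amount", "size"),
    total_spend=("amount", "sum"),
    web_share=("channel", lambda s: (s == "web").mean()),
).reset_index()

print(household)
```

Once a table like this exists, every new modeling project starts from the same clean, household-level rows instead of raw transactions.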

AAA uses SAS to create this analytical table, but Portrait could do this too, and AAA is moving towards having Portrait create it. This table is kept on its own server that the analytics department controls, unlike the primary DB server (DB2), which MIS controls.

This more structured data table allowed AAA to help managers do some analytics themselves.

Second, use tools that ease the analytical and modeling process. AAA uses Portrait software. The output AAA uses is a matrix of segment intersections, which simplifies understanding how each segment will perform; it shows how each cell in the matrix will perform relative to the norm.
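As a rough illustration of what a segment-intersection matrix indexed against the norm looks like (all data and segment names below are made up, and this is not Portrait's actual output):

```python
import pandas as pd

# Hypothetical customer-level data: two segmentations plus a response flag.
df = pd.DataFrame({
    "tenure_seg": ["new", "new", "long", "long", "long", "new"],
    "value_seg":  ["high", "low", "high", "low", "high", "high"],
    "responded":  [1, 0, 1, 0, 1, 1],
})

# Response rate for each intersection of the two segmentations,
# expressed as an index relative to the overall norm (100 = average).
norm = df["responded"].mean()
matrix = (
    df.pivot_table(index="tenure_seg", columns="value_seg",
                   values="responded", aggfunc="mean")
    / norm * 100
)
print(matrix.round(0))
```

Reading the matrix is then immediate: a cell at 150 responds 50% better than the norm, a cell at 0 doesn't respond at all.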

These two efforts shortened AAA’s modeling process from 5 months to 2 months.

Take-away: Modeling can be easy.  You only need the right structure and tools.

NCDM Notes – Target Customers Effectively Through Advanced Analytics

Presented by Intuit & Netezza (a data warehousing company recently purchased by IBM)

The timeliness of integrating data into your marketing process is critical, since data ages much more quickly than in the past.

Random note: Intuit is making a significant shift from desktop software to SaaS tools, even for their financial products.

Intuit uses the usual variety of channels to acquire customers, but they have found paid ads work best for them. They also have the usual problems of allocating and balancing spending across channels. So what data can they use to help with this? They use DoubleClick to track all converters and non-converters. (Does DoubleClick use cookies and analytics to attribute customers to marketing efforts? No, it sounds like DC only provides the data, which goes into the Netezza warehouse.)

Omniture provides the data once a person is on Intuit's site. Intuit uses Omniture only to provide data, which it passes to its data warehouse; it doesn't use any of Omniture's analytical tools. Omniture also reads the DC cookie, so it can make the link between internal and external data.
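The linking step might look something like this sketch, assuming flat exports keyed by the shared DoubleClick cookie ID (all field names here are hypothetical, not the actual DoubleClick or Omniture schemas):

```python
import pandas as pd

# Hypothetical exports: DoubleClick ad-exposure records and Omniture
# on-site records, joinable on the shared DoubleClick cookie ID.
dc = pd.DataFrame({
    "dc_cookie": ["c1", "c1", "c2", "c3"],
    "channel":   ["ppc", "display", "affiliate", "email"],
})
omniture = pd.DataFrame({
    "dc_cookie": ["c1", "c2"],
    "converted": [True, False],
})

# One merged view per exposure: which channels each visitor saw, and
# whether that visitor ultimately converted on the site. Visitors never
# seen on-site (c3) keep their exposures with a missing conversion flag.
merged = dc.merge(omniture, on="dc_cookie", how="left")
print(merged)
```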

With all the data showing what channels (aka media: online ads, organic search, PPC, affiliates, emails) a person was exposed to, they can then do the analysis to attribute the order across channels.

Intuit also looked at "channel interference" to see if one channel hurts another. They saw only a 3% occurrence of an affiliate click happening before a PPC click. There was more overlap in the other direction (PPC click, then affiliate click), but it was still not at a concerning level.
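A hedged sketch of how an overlap rate like that 3% figure could be computed from a click log; the data and the function are illustrative, not Intuit's actual method:

```python
import pandas as pd

# Hypothetical click log: one timestamped row per click, per prospect.
clicks = pd.DataFrame({
    "prospect": ["p1", "p1", "p2", "p3", "p3"],
    "channel":  ["affiliate", "ppc", "ppc", "ppc", "affiliate"],
    "ts":       pd.to_datetime(["2011-01-01", "2011-01-02",
                                "2011-01-01", "2011-01-01", "2011-01-03"]),
})

def overlap_rate(df, first, second):
    """Share of prospects with a `first` click before a later `second` click."""
    hits = 0
    n_prospects = df["prospect"].nunique()
    for _, g in df.sort_values("ts").groupby("prospect"):
        chans = list(g["channel"])
        # does `first` appear anywhere before a later `second`?
        if any(c == first and second in chans[i + 1:]
               for i, c in enumerate(chans)):
            hits += 1
    return hits / n_prospects

print(overlap_rate(clicks, "affiliate", "ppc"))  # p1 only -> 1/3
print(overlap_rate(clicks, "ppc", "affiliate"))  # p3 only -> 1/3
```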

The key point is that with prospect/exposure-level data, they can do the analysis to see how channels or media overlap. (This is pretty exciting stuff!)

Customer behavior is fairly consistent. If they come in via a paid ad, they are likely to come back via a paid ad. If they come in via a specific PPC term, it is likely they'll come back via that same term. This suggests there is little cannibalization among the online channels. (But what about cannibalization between online and offline media?)

Renewal efforts at Intuit tanked for a short while when they showed returning customers only high-end products.  When they re-introduced the complete product line, including the free version, response bumped back up.

Since most of the discussion was about online marketing and attribution, I asked about their offline efforts and how they dealt with cross-channel attribution. They use customer-unique vanity URLs (www.intuit.com/victornuovo, for example), which give them the data they need to align online and offline data.

NCDM Notes – Text and Sentiment Analytics: Transforming Call Center, Social, and Survey Data into Customer Intelligence

I was having a good conversation with the people at Experian, so I came in late to this presentation.

Take data: structured data & unstructured data from social and other external sources; use core functionality to transform that data to some meaningful format; then provide this transformed data to management and analysts.

The transformation of external and internal text data into meaningful data is very difficult. You always need to refine the algorithms. Algorithms need to look for specific keywords that appear within a certain proximity of each other.

How listening operations fit into DM cycle

Use input from listening services to trigger specific marketing events. One example: if a person books a room and mentions that they like a specific Mexican restaurant, then send them a coupon for that restaurant. (Not sure this makes sense; aren't they likely to go to this restaurant anyway? Why give them a coupon? A better example: if they mention another, outside Mexican restaurant, then send them a coupon for your Mexican restaurant.)

Make sure you scan and capture internal text data, including data from your call center. Again, you'd need algorithms to transform the text into actionable data.

For example, the words "bedroom" and "smell" found near each other in some text string could be a negative indicator. They found that it was important to set expectations before arrival, and then meet those expectations during the stay.
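A toy version of that keyword-proximity rule, far simpler than what a product like Clarabridge actually does, might look like this:

```python
def near(text, word_a, word_b, max_gap=5):
    """True if word_a and word_b appear within max_gap words of each other."""
    words = text.lower().split()
    pos_a = [i for i, w in enumerate(words) if word_a in w]
    pos_b = [i for i, w in enumerate(words) if word_b in w]
    return any(abs(a - b) <= max_gap for a in pos_a for b in pos_b)

# "smell" and "bedroom" occur three words apart: flag as a possible
# negative indicator for follow-up.
review = "There was a musty smell in the bedroom when we arrived."
print(near(review, "bedroom", "smell"))  # True
```

Real systems refine rules like this constantly, as the presenters noted; the point is just that proximity, not mere co-occurrence anywhere in the text, is what carries the signal.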

Don’t forget to capture chat text data, and link that to the customer.  Also capture incoming email text.

Monitoring allows you to rectify problems you might not otherwise be aware of. You need to specifically watch for spikes. It is helpful to segment people into promoters and detractors. If there is a common criticism in both groups, then there is something you need to address.
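One naive way to surface criticisms common to both groups (the comments and the word-level tokenization here are purely illustrative):

```python
from collections import Counter

# Hypothetical comment streams, already segmented into promoters and
# detractors (e.g., by survey score). A criticism voiced by both
# segments is a stronger signal than one voiced only by detractors.
promoters  = ["great staff", "slow wifi", "great pool"]
detractors = ["slow wifi", "noisy room", "slow checkin"]

def terms(comments):
    """Count individual words across a segment's comments."""
    return Counter(w for c in comments for w in c.split())

common = set(terms(promoters)) & set(terms(detractors))
print(common)  # 'slow' and 'wifi' appear in both groups
```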

Very interesting presentation. I think it is worth looking into the services Clarabridge provides.

Sources of Data: Rapleaf

I had a conversation with a rep at Rapleaf yesterday. They provide real-time data feeds and seem most notable for monitoring and collecting data from social networks. They provide both gender and age demographic data for free via their API.
A few interesting notes from our conversation:

  1. They maintain an Influencer Score that ranks how much influence a person has online. My contact mentioned that in their experience there is an inverse relationship between a person’s Influencer Score and the response to physical mailings: the more influence they have online, the less likely they are to respond to mailings.
  2. The opposite is true for email response, where there is a direct relationship between a person’s Influencer Score and their response to emails.
  3. It would be beneficial to target people with high scores, give them special coupons to pass along, or just generally make sure they are happy with their experience.
  4. In addition to the Influence Score, they maintain lots of other demographic data.
  5. By using the API, you could customize content (online and in emails) based on data from Rapleaf. The API's average response time is 30 milliseconds.
  6. The data is expensive, roughly $0.03–$0.05 per field per record, although the rep did imply there was room for negotiation if volume warranted it.
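As a sketch of how a score like this could drive content decisions: the lookup function below is a stub, not Rapleaf's actual API, and the threshold is invented for illustration.

```python
def lookup_score(email):
    """Stub for a real-time score lookup (the rep quoted ~30 ms average)."""
    fake_scores = {"alice@example.com": 87, "bob@example.com": 12}
    return fake_scores.get(email, 0)

def choose_offer(email, threshold=75):
    # Per the notes above: high-influence people respond better to email,
    # so give them a shareable coupon; everyone else gets the standard offer.
    if lookup_score(email) >= threshold:
        return "shareable-coupon"
    return "standard-offer"

print(choose_offer("alice@example.com"))  # shareable-coupon
print(choose_offer("bob@example.com"))    # standard-offer
```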

I will have to check out the API.