LOYALTY MARKETING SOLUTIONS

re:member group

re:engage

Communicate with your customers in meaningful ways

LOYALTY

Organizations that have loyal customers close more deals and sell more service.
Your customers expect more. Be the one to give it to them.
When you exceed their expectations they become advocates of your brand. 

Increased Visits

Loyalty program members are 70% more likely to make repeat visits.
 

Increased Referrals

Loyal customers become an extension of your brand and refer their friends, family and colleagues to your business.

Increased Sales

Loyal customers are far less price-conscious. On average, loyal customers spend 40% more than non-loyal customers.

Increased Feedback

It is important to know how you are doing, and loyal customers are far more likely to fill out surveys and leave reviews.

YOUR CUSTOMERS DEMAND THE BEST

Since we’ve implemented our Loyalty program, we’ve experienced about a 10-15% uptick in sales from previous customers and almost a 15-20% increase in overall service sales. The re:member group is fast, responsive, collaborative and easy to work with, and their analytics dashboards make it easy for me to monitor and adjust.

Mike Martin – Auto Aves

SOLUTIONS

re:sults: what we do best

Points – Integrated, automatic, easy, effective

Motivate your customers by awarding them points toward future purchases. Get your very own customized points program that keeps your customers coming back, again and again. Our points system is hands-off and tracked automatically. We do all of the work, so just kick back and watch your loyalty program come to life.

Rewards – Incentivize, value, close

Close more sales by offering your customers additional rewards for doing business with you. Our partnerships put discounts at over 250,000 retailers in your customers’ hands. We can customize an incentive program that gives your customers real value by making them Elite card-carrying members.

eMarketing – Targeted, engaging, turn-key, powerful

Communicate with your customers in meaningful ways with a targeted email campaign. We have the knowledge to design a customized, brand-right email campaign that will break through inbox clutter and grab your customers’ attention.

NUMBERS SPEAK FOR THEMSELVES

6 Billion Points Awarded

19 Thousand Redemptions

488 Million Points Redeemed

13 Years of Business

ABOUT US

Who We Are

Founded in 2002 and headquartered in St. Louis Park, Minnesota, we created the perfect loyalty solution that turns your customers into true fans. We specialize in automotive reward programs, as well as retail and service programs. Our customer retention programs are fully integrated and easy to use – perfect for people without any prior loyalty program experience.


EXPERIENCE IS KEY TO SUCCESS

  • A repeat customer spends 67% more than a new one. (BIA/Kelsey)
  • 68% say that coupons generate loyalty. (RetailMeNot)
  • 54% buy more from companies when they are rewarded for purchases. (Points.com)
  • 69% said they would break habit and choose a different brand in order to earn more points or miles. (Points.com)

WORK

Mathews Rewards
ABC Just For Me Rewards
Auto Aves Advantage
Club OpenRoad
Mander Auto Club
Capstone Rewards

CONTACT

Get In Touch

INTERESTED?



OUR ESTEEMED PARTNERS

BLOG

Database Admin Tips – Part 1


Data is a critical part of business infrastructure, and managing that data well is ever more important if a business is to function efficiently and effectively. It is crucial to make sure the data is well structured, optimized, and of the highest integrity. At the re:member group, we strive to keep our customer data secure yet as available as possible, efficient and usable even as it grows larger and more complex every day.

I was hired on with the re:member group as a .NET/SQL developer, but my role quickly grew to include caring for the databases in addition to a number of other responsibilities. The database systems in place when I arrived, about a year ago, were certainly functional and generally operating well. Still, a few things needed improvement. It was clear that the database had been worked on by many, many people over the 12 years it had been in service, each with their own style of development and management. It was also large and complicated enough that I had to do something just to get an overview of what was going on. A number of routines and agent jobs clearly needed attention, as they took far longer to run than was acceptable, and with the thousands of tables and stored procedures involved, many were plainly no longer in use. So the first thing I had to do was get an overview of the system, cut through the clutter, and start optimizing.

I’m not a fan of re-inventing wheels, and there are many, many good scripts on the internet for doing pretty much anything in SQL. There are also many very bad ones. Through experimentation and modification I have collected a series of scripts that have ended up working very well, and I will share some of the techniques I learned along the way. I’m going to skip a lot of the ‘why’ and focus on the ‘how’ in order to keep this concise and useful; there’s plenty of information on the ‘why’ available across the internet. I just want to focus on getting things done.

(Note: you may see a commented-out filter that looks like -- WHERE database_id = DB_ID(). When it is commented out, the query returns results for all databases; when it is uncommented, it returns results for the current database only. Feel free to modify as needed.)

So, let’s dive right in. First, I needed to get a handle on exactly what I was working with, so let’s find out which tables are taking up the most space:

-- Table sizes: row counts plus total, used, and unused space per table
SELECT t.name AS TableName,
       s.name AS SchemaName,
       p.rows AS RowCounts,
       SUM(a.total_pages) * 8 AS TotalSpaceKB,    -- pages are 8 KB each
       SUM(a.used_pages) * 8 AS UsedSpaceKB,
       (SUM(a.total_pages) - SUM(a.used_pages)) * 8 AS UnusedSpaceKB
FROM sys.tables t
INNER JOIN sys.indexes i ON t.object_id = i.object_id
INNER JOIN sys.partitions p ON i.object_id = p.object_id AND i.index_id = p.index_id
INNER JOIN sys.allocation_units a ON p.partition_id = a.container_id
LEFT OUTER JOIN sys.schemas s ON t.schema_id = s.schema_id
WHERE t.name NOT LIKE 'dt%'        -- skip legacy database-diagram tables
  AND t.is_ms_shipped = 0          -- skip system-shipped tables
  AND i.object_id > 255
GROUP BY t.name, s.name, p.rows
ORDER BY TotalSpaceKB DESC

OK, that’s good. Now let’s find out which of these tables are actually being used. There is a very useful dynamic management view called sys.dm_db_index_usage_stats which holds much of this information. Unfortunately, this view’s statistics are cleared every time the server or the SQL service gets restarted. So what we need to do is first create a table to hold this data, then create a procedure to copy the useful information from dm_db_index_usage_stats into the newly created table, and finally create an agent job that runs every half hour or so and calls the procedure to keep the table up to date. So, first let’s create our table:

-- Persistent store for table usage data, keyed by schema + table name
CREATE TABLE [dbo].[tbl_TableUsage] (
    [Schema]        [varchar](50)  NOT NULL,
    [TableName]     [varchar](255) NOT NULL,
    [LastReadTime]  [datetime]     NULL,
    [LastWriteTime] [datetime]     NULL,
    CONSTRAINT [PK_tbl_TableUsage] PRIMARY KEY CLUSTERED ([Schema] ASC, [TableName] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
              ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

Then, we need to create the routine which will update this table:

CREATE PROCEDURE [dbo].[Get_ServerInfo]
AS
BEGIN
    SET NOCOUNT ON

    -- Working copies: one for the per-table results, one for a snapshot of the DMV
    DECLARE @t table([Schema] varchar(50), [TableName] varchar(255), [LastReadTime] datetime, [LastWriteTime] datetime)
    DECLARE @latest table([object_id] int, last_user_seek datetime, last_user_scan datetime, last_user_lookup datetime, last_user_update datetime)

    -- Snapshot the usage stats (uncomment the WHERE clause to limit to the current database)
    INSERT INTO @latest
    SELECT [object_id], last_user_seek, last_user_scan, last_user_lookup, last_user_update
    FROM sys.dm_db_index_usage_stats -- WHERE database_id = DB_ID()

    -- Collapse seek/scan/lookup into one "last read" and take the latest write per table.
    -- The final UNION ALL branch pulls in every user table so that tables with no
    -- recorded activity still show up, with NULL read/write times.
    INSERT INTO @t
    SELECT [Schema] = OBJECT_SCHEMA_NAME([object_id]),
           [TableName] = OBJECT_NAME([object_id]),
           LastReadTime = MAX(last_read),
           LastWriteTime = MAX(last_write)
    FROM (
        SELECT [object_id], last_user_seek, NULL FROM @latest UNION ALL
        SELECT [object_id], last_user_scan, NULL FROM @latest UNION ALL
        SELECT [object_id], last_user_lookup, NULL FROM @latest UNION ALL
        SELECT [object_id], NULL, last_user_update FROM @latest UNION ALL
        SELECT so.[object_id], NULL, NULL FROM sys.objects so WHERE so.type = 'U' AND so.[object_id] > 100
    ) AS x ([object_id], last_read, last_write)
    GROUP BY OBJECT_SCHEMA_NAME([object_id]), OBJECT_NAME([object_id])

    -- Add any tables we have not seen before...
    INSERT INTO tbl_TableUsage
    SELECT b.* FROM @t b
    WHERE NOT EXISTS (SELECT 1 FROM tbl_TableUsage a
                      WHERE a.[Schema] = b.[Schema] AND a.[TableName] = b.[TableName])

    -- ...and roll existing rows forward whenever the snapshot shows newer activity
    UPDATE b
    SET b.LastReadTime = a.LastReadTime, b.LastWriteTime = a.LastWriteTime
    FROM tbl_TableUsage b
    INNER JOIN @t a ON a.[Schema] = b.[Schema] AND a.[TableName] = b.[TableName]
    WHERE a.LastReadTime > ISNULL(b.LastReadTime, '1900-01-01')
       OR a.LastWriteTime > ISNULL(b.LastWriteTime, '1900-01-01')

    SELECT * FROM tbl_TableUsage ORDER BY [Schema], [TableName]
END

Finally, create an agent job to run this stored procedure once every half hour, or whatever interval is to your liking. Now you can either check tbl_TableUsage or run the stored procedure Get_ServerInfo, and you will see the last time each of your tables was accessed. Keep in mind that some tables may only be accessed once a week, or even once a month; after some time goes by you will have a clear picture of which tables are actually in use.
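If you would rather script the agent job than click through the Agent UI, a minimal sketch using msdb’s job procedures might look like the following. The job name, schedule name, and database name are placeholders of my own choosing; substitute your own.

EXEC msdb.dbo.sp_add_job
    @job_name = N'Track Table Usage'            -- placeholder job name

EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'Track Table Usage',
    @step_name = N'Run Get_ServerInfo',
    @subsystem = N'TSQL',
    @database_name = N'YourDatabase',           -- placeholder: your database
    @command = N'EXEC dbo.Get_ServerInfo'

EXEC msdb.dbo.sp_add_jobschedule
    @job_name = N'Track Table Usage',
    @name = N'Every 30 minutes',
    @freq_type = 4,                             -- daily
    @freq_interval = 1,
    @freq_subday_type = 4,                      -- subday units = minutes
    @freq_subday_interval = 30                  -- run every 30 minutes

EXEC msdb.dbo.sp_add_jobserver
    @job_name = N'Track Table Usage'            -- register on the local server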

OK, now that we have a grasp on what our tables are up to, let’s find out whether any procedures or agent jobs are ‘hanging’ or otherwise causing issues. First, let’s check those agent jobs and see which ones are taking the longest to run:

-- Job run times from the agent history. run_duration is stored as an HHMMSS
-- integer, so it must be decoded into seconds before converting to minutes
-- (adding 30 seconds rounds the integer division to the nearest minute).
SELECT j.name AS 'JobName', run_date, run_time,
       msdb.dbo.agent_datetime(run_date, run_time) AS 'RunDateTime',
       run_duration,
       ((run_duration / 10000 * 3600          -- hours to seconds
         + (run_duration / 100) % 100 * 60    -- minutes to seconds
         + run_duration % 100                 -- seconds
         + 30) / 60) AS 'RunDurationMinutes'
FROM msdb.dbo.sysjobs j
INNER JOIN msdb.dbo.sysjobhistory h ON j.job_id = h.job_id
WHERE j.enabled = 1
ORDER BY RunDurationMinutes DESC

It would then be a good idea to investigate what the longest-running agent jobs are actually doing and see how they might be optimized. If an agent job, or anything else, is hanging, the following should give you a good idea of what’s going on:

-- Everything that is currently waiting, joined to the request that is waiting.
-- OUTER APPLY (rather than CROSS APPLY) keeps waiting tasks whose sql_handle
-- is NULL, preserving the outer join above.
SELECT * FROM sys.dm_os_waiting_tasks wt
LEFT OUTER JOIN sys.dm_exec_requests c ON wt.session_id = c.session_id
OUTER APPLY sys.dm_exec_sql_text(c.sql_handle)

What’s especially useful is the last column, which gives you the text of the actual command that is causing the issue, so you can track it down and optimize or handle it accordingly.
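If you only want the essentials rather than every column, a narrower version of the same query (same DMVs as above; the column picks are my own) might look like this:

SELECT wt.session_id, wt.wait_type, wt.wait_duration_ms,
       wt.blocking_session_id,     -- who is holding us up, if anyone
       st.text AS command_text     -- the statement that is waiting
FROM sys.dm_os_waiting_tasks wt
LEFT OUTER JOIN sys.dm_exec_requests c ON wt.session_id = c.session_id
OUTER APPLY sys.dm_exec_sql_text(c.sql_handle) st
ORDER BY wt.wait_duration_ms DESC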

Finally, we want to prevent blocking, which is a very bad thing. Blocking happens when two separate routines attempt to access the same table at the same time and, for whatever reason, one locks the table, preventing the other routine from doing whatever it’s trying to do. It can make routines fail, cause timeouts, and prevent users from getting their work done. Probably the easiest way I’ve found is to drop the results of the system stored proc "sp_who2" into a temp table, look for rows where the ‘BlkBy’ column contains data, and then see who is blocking whom via the ‘Login’ and ‘HostName’ columns. Using techniques similar to what I’ve posted above, you could load the results into a table, create a procedure that checks it and sends a notification when blocking occurs, and run that in an agent job however often you’d like. Additionally, the ‘CPUTime’ and ‘DiskIO’ columns are useful for identifying costly queries. I’ve not actually done this on my system, for reasons I will explain shortly, so I do not have the scripts to post; however, if I had no extra DB tools, that’s what I would do.
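For what it’s worth, a minimal sketch of that temp-table approach might look like the following. The column list mirrors sp_who2’s output, and the check for a non-empty ‘BlkBy’ value (sp_who2 reports ‘  .’ for sessions that are not blocked) is based on observed behavior, so treat this as a starting point rather than a finished script.

-- Capture sp_who2 into a temp table so it can be queried
CREATE TABLE #who (
    SPID int, [Status] varchar(255), [Login] varchar(255), HostName varchar(255),
    BlkBy varchar(255), DBName varchar(255), Command varchar(255),
    CPUTime bigint, DiskIO bigint, LastBatch varchar(255),
    ProgramName varchar(255), SPID2 int, REQUESTID int)

INSERT INTO #who EXEC sp_who2

-- Sessions that are currently blocked, and who is blocking them
SELECT SPID, BlkBy, [Login], HostName, DBName, Command, CPUTime, DiskIO
FROM #who
WHERE LTRIM(RTRIM(BlkBy)) <> '.'    -- '  .' means "not blocked"
ORDER BY CPUTime DESC

DROP TABLE #who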

Time for a shameless advertisement. I was fortunate to be provided with some database monitoring software that will monitor the health of the database server and databases for most any condition, help identify bottlenecks, resolve blocking, optimize queries, and much, much more. It is incredibly useful for all levels of DB admins, from beginner to expert. It is called "Spotlight on SQL Server", and more information about it can be found at http://software.dell.com/products/spotlight-on-sql-server-enterprise/. There is a free trial available, and I highly recommend giving it a try. I really cannot say enough good things about it, and I am in no way affiliated with it from a business standpoint.

So, I think this is a pretty good start for getting your bearings on the system and seeing what is causing the most basic issues.

Next time, I will discuss indexes, another critical part of any database system that is going to run efficiently.

Thank you for reading, and good luck with all things SQL.

Lou

Do Robots Make Good Conversationists?

Have you ever lost your credit card and had to call in to get a new one? Picture this: you go to the grocery store and find yourself frantically digging for your card to pay. "It’s gone," you think to yourself solemnly. Embarrassed, you tell the cashier you lost your card and scurry out of the store before anyone sees you can’t pay. You sit in your car and sadly dial a number into your cell phone, button by button. You dread this moment. You feel anxious and scared. You put the phone to your ear and... whomp whomp... it’s a recorded robot waiting to crush your hopes and dreams. Monotonously, it demands that you enter your social security number. You don’t want to give your precious social security number to the robot, who cannot be trusted. He is boring and unfriendly. You would love to talk to a real person for once. You throw your phone out the window in anger and watch as it gets run over by a big truck. You weep silently. Now you have no phone and no credit card. Your life is over. Have you ever been in this situation? Well, I have, and it is painful to converse with the robot on the telephone.

Nobody wants to talk to a machine. For one, they are boring and often do not catch what we say correctly. We want to talk to people who can help us and listen to our issues. This is why robots do not carry conversations well. Having a customer service team to answer the phone and greet customers is key. Most of the time, real, live people want to talk to real, live people. People will respect your company more if they can say, "Yeah, so-and-so is awesome! They helped me with my needs, they were very friendly, and I got to speak with a human." I don’t know about you, but I can usually tell over the phone if someone is kind and happy to help me. Even if the customer is frustrated, you can ease the situation by being kind and responsive. Customers want you to be attentive to their needs, and that is something a recording just cannot do. Pressing "zero" a million times is annoying. Do you think your customers will want to call if they cannot speak with anyone, or if they have to work to get to you? Do you think customers will be more loyal and spend their hard-earned dollars with you if you offer only recorded customer service? Probably not. Think about the way your telephone customers are handled. Think about your own experiences with recordings. Ask yourself: what can a robot do for me? The answer is, not much in terms of good customer service.

Positivity = Productivity

As a leader, are you conscious of the effect you have on your work environment? Every single interaction you have in the workplace has consequences.