
Did you like the database webinar last month?  The one about Corporate Data Review?  If you did, then you’ll love this upcoming free webinar!

Welcome to our fourth monthly Database webinar.  Doug Tombow will introduce you to some key concepts and provide some understandable examples of what you should be thinking about as you work to get useful information from your existing corporate data.

Topics presented:

  • What’s the difference between data and information?
  • Information drives our best decisions!
  • Gleaning information from data
  • A workable starting point

When: Tuesday, April 28, 2015 from 11:00 AM to 11:30 AM (EST)

Click here to RSVP

The ‘information age’ that surrounds us tends to lack, um, information.  Oh sure, there are plenty of devices and systems collecting data and making it available to us.  But how easy is it to gather the data we need to make a good decision?  Or how useful is that data when we want to make an important decision?

Data and information are very different things.  One is the raw ingredients, the other the finished product.  In other words, data is what we use to create information.

A simple example is a record of a call we received on our cellphone.  It will contain the date & time the call started, the source number, the destination number and the date & time the call ended.  Is that record data or information?  Sometimes the distinction is a bit unclear.  That is – without the proper context – the record can appear to be either data or information.

In this example, the call record can be considered information if the question is ‘who called this morning?’  That question can be easily answered with the data contained in the call record.  However, it can only be considered data if the question posed is ‘how often does that caller interrupt my mornings?’  Our context is different (a higher-level perspective), which causes the exact same record to be viewed not as the final information but rather as ‘one of many’ data items that need to be turned into information.

How do we go about transforming data into information?  Ironically, it is by applying some type of ‘data processing’ to the data.   For our call records the processing is simply adding up the total minutes over a certain period of time.  And this creates new data in the process – the total minutes used.  This is considered aggregated data and is the foundation of getting information from the data we have.  As we apply additional data processing to the data – along with additional perspective – we begin to extract the meaningful information our data holds.
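The call-record aggregation described above can be sketched in a few lines of Python (the records and field names here are hypothetical, just to make the idea concrete):

```python
from datetime import datetime
from collections import defaultdict

# Hypothetical raw data: one record per call received, as described above.
calls = [
    {"source": "330-555-0101", "start": datetime(2015, 4, 6, 8, 15), "end": datetime(2015, 4, 6, 8, 27)},
    {"source": "330-555-0101", "start": datetime(2015, 4, 13, 9, 2), "end": datetime(2015, 4, 13, 9, 10)},
    {"source": "216-555-0199", "start": datetime(2015, 4, 7, 14, 30), "end": datetime(2015, 4, 7, 14, 35)},
]

# 'Data processing': aggregate the raw records into total minutes per caller.
total_minutes = defaultdict(float)
for call in calls:
    total_minutes[call["source"]] += (call["end"] - call["start"]).total_seconds() / 60

# The aggregate is new data -- and, given the right question, information.
print(total_minutes["330-555-0101"])  # 20.0 minutes from that caller
```

The same idea scales up: each pass of processing (totals per day, per caller, per month) produces new aggregated data that answers a higher-level question.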

Is ‘processed data’ automatically useful information?  No.  Not unless the data is accurate, the processing is accurate and the proper perspective was applied the whole way through.

Also, the data must be a suitable foundation for the information we desire, and the processing of it must not change its suitability.  Then, we must apply the appropriate perspective when using the information.  Poorly collected data, processed with inappropriate logic, is actually just (more) useless data.  And the world already has too much of that!

Hopefully, you can see that the key to turning data into useful information involves much more than simply collecting the data and running a report.  It involves understanding what information we want to get out of the data, along with the context that will be used when we analyze it.  It’s all related, and it makes a huge difference in the usefulness of our information.

What tools can we use?  There are a number of tools we can use to retrieve data.  Each of them has its strengths and weaknesses.  Items like:

  • End User tools (e.g., Microsoft Excel, Access)
  • Developer tools (Visual Studio)
  • Server Level tools (e.g., SSRS, part of Microsoft’s SQL Server)

The concepts are actually pretty simple, but they might not be immediately obvious.  So let’s get started right now!  Let’s undertake the task of better understanding how to transform our corporate data into useful information.  Information that can be used across all of our business processes.  As usual, it’s a simple process that every business owner can get started with immediately.

Take the first step by attending the upcoming Simplex-DBA webinar ‘Turning Data into Information’ on Tues, 4/28/15 at 11 AM EST.  During that session we’ll cover this topic along with a few others to help us understand how we can get better information from our business data and systems.  After all, that’s the reason we collected the data in the first place!

I hope you all can attend and look forward to seeing you online.

Doug

_________________________________________________________________

Simplex-dba is part of Simplex-IT, an award-winning IT services organization located in Hudson, Ohio (between Cleveland and Akron).  Simplex-dba is aimed at maintaining the health, security, performance and usability of corporate data (focusing on Microsoft SQL Server) in the Small to Medium Business world.  Our offerings include monitoring, management, training, proactive maintenance and strategic guidance.  We do so at a fraction of the cost of a full-time DBA (database administrator) or those “big” consulting firms.

 

Corporate data is important.

Most corporations keep track of their operations.  How they service their customers.  How they work with their vendors.  Cash flow.

We usually gather this data through applications, and the data is “structured,” meaning it’s stored in a database technology such as SQL Server or Oracle.

Our topic for our Lunchinar this month is “Keeping Corporate Data Healthy.”

Data is critical to your business.

It needs to be healthy and reliable.  It needs to be there when you need it.  And it needs to provide you with answers to questions that assist in your operations, both strategically and operationally.  In other words, it needs to be ‘good’ data.

Do you have confidence your data is good?  Do you proactively deal with issues involving your data and have confidence that it is healthy?  That it will ‘be there’ when you need it – accurate, secure and ready to be used?

Having comfort and confidence in your data is a simple matter of addressing the ‘3 pillars’ of good data.

  • Health:  your data can be used when you need to use it.
  • Performance: your data is available quickly enough to be useful.
  • Usefulness: your data serves its intended purpose, but can be readily used for others as well.

This month, we’re going to focus on “Health.”  This includes:

  • Backups – proven to be usable should the need arise
  • Integrity – your data is not corrupt and can be relied upon for decision making
  • Security – your data is available only to those you choose to have access
  • Availability – it is able to be used when you need it
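As a tiny illustration of the ‘Backups’ item above, a monitoring script can flag databases whose most recent backup is older than your policy allows.  A minimal Python sketch follows; the database names and record layout are hypothetical (in SQL Server, the real history lives in msdb.dbo.backupset):

```python
from datetime import datetime, timedelta

# Hypothetical backup history: database name -> time of last successful backup.
backup_history = {
    "SalesDB": datetime(2015, 4, 14, 23, 0),
    "HRDB": datetime(2015, 4, 1, 23, 0),
}

def stale_backups(history, now, max_age_days=1):
    """Return databases whose most recent backup is older than max_age_days."""
    cutoff = now - timedelta(days=max_age_days)
    return [db for db, last in history.items() if last < cutoff]

# HRDB hasn't been backed up in two weeks, so it gets flagged.
print(stale_backups(backup_history, now=datetime(2015, 4, 15, 12, 0)))  # ['HRDB']
```

A check like this only tells you a backup *ran*; proving the backup is usable still means actually restoring it somewhere, which is the point of the ‘proven to be usable’ wording above.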

Doug Tombow (who heads up our new Simplex-dba practice) and Bob Coppedge will be demonstrating some of the tools and tactics we’re using to manage and maintain the health of our customers’ data.

Join us!  It’s free, there’s going to be food (unless you join us online, in which case you’re on your own), we’ll give away a copy of Windows 8.1 and who knows what else?!

Click here to RSVP and for more information

When: Wednesday, April 15, 11:30 AM to 1:00 PM (Eastern)


At Simplex-IT, we specialize in sharing our knowledge with several free webinar and Lunchinar events each month on topics such as Microsoft Office, Project Management and Data Practices.  Contact us at Info@Simplex-IT.com, Twitter (Simplex_IT), LinkedIn (http://www.linkedin.com/company/simplex-it) or Facebook (http://www.facebook.com/simplex.it).  You can also check out our YouTube channel, with over 100 videos, at https://www.youtube.com/user/SimplexITBob.

 

Free Webinar: “Corporate Data Review – What’s That?”

Did you like last month’s article (and webinar) about Database Performance?  If you did, then you’ll love this upcoming free webinar!

Welcome to our third monthly Database webinar.  Doug Tombow will introduce you to some key concepts and provide some understandable examples of what you should be thinking about as you assess your existing corporate data.

Topics presented:

  • What is a Corporate Data Review?
  • How Useful is Your Corporate Data?
  • Determining the ‘usability’ of your corporate data
  • Enhancing the usability of corporate data

When: Tuesday, March 24, 2015 from 11:00 AM to 11:30 AM (EST)

Click here to RSVP (and possibly win a copy of Microsoft Office Professional 2013)

A Corporate Data Review is just a fancy phrase for something we are probably doing already: assessing the usefulness of our business data and systems.  But what we are probably not doing is being proactive about that process.  In other words, formally reviewing our data and then taking action when we discover the information is not quite as useful as we imagined.

It happens all the time.  We deploy a new ‘wonder system’ that will ‘manage all of our data and provide information at our fingertips’.  It’s promised to require very little effort on our part and to deliver a wonderful work experience each day.  We’ll know everything about anything with just a couple of clicks of our mouse.  What could be better?

But then we soon discover there are some limitations in the wonder system.  Ones that actually prevent those benefits from being realized.  There was a problem getting the historical sales data loaded.  Someone decided to issue new customer numbers, and now we have two accounts for each of our customers.  The programmer’s idea of ‘ease of use’ requires us to open five different forms to update a customer address.  Unfortunately, these are all-too-common occurrences in new systems.

What can we do about it?  Well, the starting point is to have an understanding of your existing business systems and data.  What applications do we use, what data do they contain, where is that data stored, what format is the data saved in?  How many of us can answer those questions accurately?  It’s not easy when you have 10+ systems installed by a variety of vendors over time.

When our latest system was purchased, how much analysis of the data was performed?  From what we see in the industry, it’s usually the ‘exact amount required’.  That is, the exact amount required to get the system operational (to some degree).  Notice that may not coincide with the exact amount required to fully benefit from your existing data and your new system.  Data analysis and systems integration are specialized skills.  Skills that take years to master.  Skills that are far beyond those required to install a new system, load some files and give the users an overview on how to use the screens.

This is why the promises of ‘information at our fingertips’, ‘a holistic view of your entire customer relationship’ and others like them are usually not fully realized.  Because in the end, no one is analyzing our data and making the required decisions over time to ensure it is reusable into the future.  They’re just installing new systems.

Let’s change that right now!  Let’s undertake the task of determining the best way to use our corporate data across all of our business processes.  It’s actually a very simple process that every business owner can get started with immediately.

Take the first step by attending the upcoming Simplex-DBA webinar ‘Corporate Data Review – What’s That?’ on Tues, 3/24/15 at 11 AM EST.  During that session we’ll cover this topic along with a few others to help us understand how we can get the maximum value from our business data and systems.  Because that’s the reason we have these systems in the first place!

I hope you all can attend and look forward to seeing you online.

Doug

Database performance is a fun topic (read our other blog entry on this by clicking here).  “I’m running reports overnight.  They take 3 hours!”  Not necessarily the end of the world (the completed reports are still waiting for me in the morning).  “There’s a customer on our web site.  It takes 15 seconds to see their order status.”  Ok, now we’ve got a problem.

“Sooooo…let’s buy new hardware.”  That’s the quickest and often the easiest (in terms of justifying the expense) sell to management.  Because more…more CPU, more memory, more disk…that’s going to fix it, right?

Not necessarily.  If you’re not measuring what was actually happening during those 15 seconds, then you don’t know why it took so long.  That’s why we have Performance Counters.  Metrics we can get (in real time, if needed) to tell us what’s actually going on.  Why it took 15 seconds (or 3 hours, for that matter).  And the conclusions we can come to from these metrics?

Welcome to our second monthly Database webinar.  Doug Tombow will introduce you to some key concepts as well as provide some understandable examples of what you should be thinking about regarding the performance of your data.

Topics presented:

  • Why does Data Performance matter?
  • Data Performance is Relative
  • Determining Your Required Data Performance Levels
  • Monitoring Data Performance
When: Tuesday, February 24, 2015 from 11:00 AM to 11:30 AM (EST)

RSVP by clicking here, and you’ll receive connection instructions the day of the event.

The presentation will be aimed at the non-technical level.

And a lucky attendee will get a free copy of Microsoft Office 2013 Professional!

 

Bob here.  Database performance is a fun topic.  “I’m running reports overnight.  They take 3 hours!”  Not necessarily the end of the world (the completed reports are still waiting for me in the morning).  “There’s a customer on our web site.  It takes 15 seconds to see their order status.”  Ok, now we’ve got a problem.

“Sooooo…let’s buy new hardware.”  That’s the quickest and often the easiest (in terms of justifying the expense) sell to management.  Because more…more CPU, more memory, more disk…that’s going to fix it, right?

Not necessarily.  If you’re not measuring what was actually happening during those 15 seconds, then you don’t know why it took so long.  That’s why we have Performance Counters.  Metrics we can get (in real time, if needed) to tell us what’s actually going on.  Why it took 15 seconds (or 3 hours, for that matter).  And the conclusions we can come to from these metrics?

Two words:

It depends

Read on!



This article excerpt, by Robert L. Davis, originally appeared here: http://bit.ly/1ryLksn

When you’re working through some vague performance issues (e.g., “SQL Server seems slow today”), one of the common things to do is to collect some performance counters. If you are collecting performance counters for the first time on the server, you don’t have anything to compare them against. This usually leads to searching the web for resources that will tell you what numbers the counters should be. And sadly, it quite often ends with either misleading advice or with a disappointing message of “it depends.”

There are several reasons why we say that the target values for performance counters depend, and why we say that you need to baseline your systems.

  • Workloads and server configurations vary wildly.
  • Workloads change.
  • SQL Server environments are constantly evolving.

Baselining
So we tell you to baseline to know what your system generally looks like when it’s healthy. If you are baselining, you can compare your performance counters to last week’s numbers or last month’s number or even last year’s numbers. But if you’re not already baselining, and you have an emergent issue that you need to investigate right now, this advice is not going to help with this issue. Most of the performance counters are not going to be very helpful, and you will need to dig into the current activity on the server. Look for obvious things like blocking and extremely high degrees of parallelism. Look at the wait statistics for the currently active requests and try to determine if we have an issue with memory or CPU utilization or other bottlenecks.
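The baselining advice above can be sketched simply: keep historical samples of a counter collected while the system was healthy, then flag a new reading that falls far outside that baseline.  A minimal Python sketch; the counter values and the three-sigma threshold are illustrative, not recommendations:

```python
from statistics import mean, stdev

# Hypothetical baseline: weekly samples of a counter such as
# 'Batch Requests/sec', collected while the server was healthy.
baseline = [120.0, 135.0, 128.0, 131.0, 124.0, 129.0]

def looks_anomalous(sample, history, sigmas=3.0):
    """Flag a reading more than `sigmas` standard deviations from the baseline mean."""
    mu, sd = mean(history), stdev(history)
    return abs(sample - mu) > sigmas * sd

print(looks_anomalous(127.0, baseline))  # within the normal range
print(looks_anomalous(480.0, baseline))  # far outside it -- worth investigating
```

The point is not the specific math; it’s that without the `baseline` list collected ahead of time, the 480 reading has nothing to be compared against, and you’re back to “it depends.”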