Scott Howser, vice president of marketing for Hadapt, joined Dave Vellante and Jeff Kelly of Wikibon on theCUBE this afternoon, broadcast live from the MIT Chief Data Officer and Information Quality Symposium.
Before discussing the widely adopted Hadoop ecosystem, Howser reflected on the early days of Hadoop and the authentication challenges that have evolved alongside the maturing ecosystem.
Early data practice required triangulating on an individual user in order to engage them and verify their authenticity. As Howser noted, single sign-on was the Holy Grail, but that method is no longer workable between a provider and a user. As additional channels come online, verifying that users are who they claim to be becomes increasingly complicated. This is true in no small part because of a dramatic change in engagement, where the requirements granting a user access to data are no longer uniform.
Before the concept of omnichannel took hold, databases were small and disparate. Once the collected data was merged, drawing any knowledge from analytics on the pooled information took months or years. The need to act faster on collected data led directly to the revolution in the Hadoop ecosystem.
Addressing how data quality is reconciled with omnichannel and schema, Howser commented, "We allow users to dump all that data into one schema." This lets a user normalize data through iterations in Hadoop. As access to more channels is provided, a marketer can draw from mountains of data in one common repository. The direct result is a marked decrease in the time required to cull and normalize data, letting the user interact with it in a more timely and meaningful way.
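The idea of pooling raw channel feeds into one repository and cleaning them in iterative passes can be sketched as follows. This is a minimal illustration, not Hadapt's actual schema or tooling: the channel names, field names, and records are all hypothetical.

```python
# Hypothetical sketch: records arriving from several channels land in one
# common repository, then a normalization pass unifies field names and
# cleans values. Further passes could be layered on the same pool.

raw_records = [
    {"channel": "web",    "Email": "ANA@EXAMPLE.COM", "visits": "3"},
    {"channel": "mobile", "email": "ana@example.com ", "visits": 2},
    {"channel": "store",  "email": "bo@example.com",   "visits": None},
]

def normalize(record):
    """One normalization pass: lowercase field names, clean values."""
    out = {k.lower(): v for k, v in record.items()}
    if isinstance(out.get("email"), str):
        out["email"] = out["email"].strip().lower()
    out["visits"] = int(out["visits"] or 0)
    return out

# The common repository: every channel's data, normalized into one shape.
repository = [normalize(r) for r in raw_records]
total_visits = sum(r["visits"] for r in repository)
print(total_visits)  # 5
```

Because all channels share one pool, later passes (deduplication, enrichment) can run over `repository` without re-collecting data from each source.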
Howser acknowledged there would naturally be a trade-off between accuracy and performance. This can be mitigated, he said, by building in models that let you decide when a more rapid analysis is good enough: aiming for an 80 percent accuracy rate would, in his opinion, justify the expedited analytics. "The element of time...the time it takes to define something that rigid takes months," Howser said. That span of time is not practical for running a business or organization. "Let's fail fast. Do a lot of iterations. You set some sort of confidence in this particular application."
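The fail-fast trade-off Howser describes can be illustrated with a simple sampling estimate: compute a metric from a small random sample instead of a full scan, accepting bounded inaccuracy in exchange for speed. The dataset and tolerance below are illustrative assumptions, not anything from the interview.

```python
import random

random.seed(42)
population = list(range(1_000_000))  # stand-in for a large dataset

def approximate_mean(data, sample_fraction=0.01):
    """Estimate the mean from a 1% random sample rather than a full pass."""
    sample = random.sample(data, int(len(data) * sample_fraction))
    return sum(sample) / len(sample)

estimate = approximate_mean(population)          # fast, approximate
exact = sum(population) / len(population)        # slow, exact
error = abs(estimate - exact) / exact
# The estimate is close enough to act on, at a fraction of the cost.
assert error < 0.05
```

Each iteration can tighten `sample_fraction` until the observed error meets whatever confidence the application demands, rather than waiting months for a rigid, fully accurate definition.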
Vellante noted that the role of CDO is only a decade old. Given that the CDO's charge is to achieve information quality, and that Hadoop is changing that model, he asked Howser, "How do you see the notion of information quality adapting?" Howser responded that the CDO should have an unparalleled command of the entire business, and should be able to accelerate what the business is trying to accomplish by applying quality information to the problems it needs to solve.
The MIT CDOIQ Symposium is being held on the campus of MIT in Cambridge, Massachusetts. Cambridge, as it turns out, is also the new home of Hadapt's headquarters. Howser discussed how the move into the heart of the tech corridor has been a boon for recruiting and for spreading the word about the benefits of Hadoop. "I believe Hadoop is the operating system of big data. I stand behind that." The maturing of Hadoop is signaled by Hadapt's rapid growth. "What people are engaging us to do is transition from legacy methodologies."
Scott Howser, Hadapt, at MIT Information Quality 2013 with Dave Vellante and Jeff Kelly
@thecube
#MITIQ