Overall, a well-executed, fact-filled technical conference that catered to advanced MongoDB users and technical experts coming to learn more about MongoDB.
The keynote address, delivered by Ben Sabrin, VP of Sales for 10gen, presented the idea that the data revolution is changing everything except the underlying database technology. MongoDB represents a significant step forward for databases not based on the RDBMS or SQL paradigms. The principle behind MongoDB is to maintain a natural approach to humongous amounts of data. Ben's main message: MongoDB is a more natural approach to database usage and end-user solutions.
In the “Building Your First MongoDB Application” session, Ian Whalen of 10gen delivered an informative presentation on the key concepts of the database while building an example application for the audience. This session quickly ramped up those new to MongoDB and provided the basics they would need for the rest of the day's sessions.
The “Schema Design” session outlined the basics of designing a schema. MongoDB requires a more thoughtful approach to schema design, with more focus on the intended use of the data. MongoDB offers exciting possibilities with its collection and document model. Documents are JSON-like data structures that are highly extensible, and they are stored in collections. In SQL-speak, collections are tables and documents are rows within tables. This construct allows the application programmer great flexibility when creating the database design. However, with flexibility comes responsibility. For example, there is no SQL-like JOIN capability within MongoDB, so the data schema must be well designed from the beginning.
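To make the "no JOIN" point concrete, here is a minimal sketch of the document model for a hypothetical blog application (the field names and values are illustrative, not from the session). Where an RDBMS would split posts and comments into two tables and JOIN them at query time, MongoDB lets you embed the comments inside the post document, so one read returns everything:

```python
# A hypothetical blog post modeled as a single MongoDB document.
# In an RDBMS this would span a posts table and a comments table,
# joined at query time; here the comments are embedded sub-documents,
# so a single document read returns the whole post.
post = {
    "_id": 1,
    "title": "Why Schema Design Still Matters",
    "author": "jane",
    "tags": ["mongodb", "schema"],
    "comments": [  # embedding replaces the JOIN
        {"user": "bob", "text": "Nice overview."},
        {"user": "ann", "text": "What about indexing?"},
    ],
}

# Reading the post and its comments is a single lookup, not a JOIN:
comment_count = len(post["comments"])
print(comment_count)  # 2
```

The trade-off is exactly the "responsibility" the session warned about: you must decide up front whether comments are read with their post (embed) or queried independently at scale (reference), because there is no JOIN to fall back on later.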
The “Replication & Replica Sets” session was a clear demonstration of the capabilities of MongoDB in an enterprise setting. The ability to replicate and recover quickly is always a requirement for production environments. MongoDB has a straightforward architectural solution that makes reliable replication possible. The ability to designate replicas by intended use supports both transactional (write-heavy) and reporting (read-heavy) environments across replicated databases. This type of capability in current RDBMS products and SAN/NAS disk solutions is often costly and difficult to manage. Out of the box, MongoDB provides a flexible replication architecture that is easily defined and has predictable outcomes.
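As a sketch of how "designate replicas by intended use" looks in practice, below is a hypothetical three-member replica set configuration, written as a Python dict mirroring the document passed to rs.initiate() in the mongo shell (the hostnames are made up; the fields — _id, members, priority, hidden — are standard replica set configuration fields). One member takes writes, one is a failover candidate, and one is hidden from clients and reserved for read-heavy reporting:

```python
# Hypothetical replica set configuration, shaped like the document
# given to rs.initiate().  Priorities steer elections: the priority-2
# member is preferred primary (write-heavy), priority 1 is a failover
# secondary, and the priority-0 hidden member never becomes primary
# and is invisible to normal clients -- ideal for reporting queries.
rs_config = {
    "_id": "rs0",
    "members": [
        {"_id": 0, "host": "db1.example.com:27017", "priority": 2},
        {"_id": 1, "host": "db2.example.com:27017", "priority": 1},
        {"_id": 2, "host": "db3.example.com:27017",
         "priority": 0, "hidden": True},  # reporting replica
    ],
}

hidden_members = [m for m in rs_config["members"] if m.get("hidden")]
print(len(hidden_members))  # 1
```

Note that a hidden member must have priority 0; that pairing is what cleanly separates the reporting workload from the election and write path.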
Paul Pederson, Deputy CTO of 10gen, discussed two kernel functions within the database: concurrency and page fault handling. Read/write concurrency in release 2.2 is a new evolutionary step that brings MongoDB more in line with what is expected from RDBMS queries and the need for a read-consistent view. Page fault handling has evolved from an ad hoc approach to a more systematic methodology within the kernel that greatly reduces the performance impact on the overall database, its collections and individual queries. Again, another evolutionary step that supports an enterprise with big data needs.
Emily Stolfo presented “The New Aggregate Framework” for MongoDB 2.2. The framework is a “pipeline” architecture that provides aggregation capabilities. A group of new operators has been added, including $match, $project, $group, $sort, $unwind, $limit and $skip.
They can be strung together much like a bash command line: $ ps -ef | grep root | sort | tail
In SQL-speak, the aggregation framework supports GROUP BY-style aggregation in a way that does not require map/reduce programming. But as in any pipeline architecture, the sequencing of the operators is critical to generating the expected outcome. For example, a $unwind before a $sort could yield a very different result than a $sort followed by a $unwind. The pipeline approach and its operators are reminiscent of the architecture employed by Ab Initio's ETL tool circa 2006. Data is streamed via pipes to highly specialized components, and it is the programmer/designer's task to determine the order and configuration of each component to create the intended outcome. The Aggregation Framework provides this same type of modularity, power and flexibility.
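The ordering point can be demonstrated without a MongoDB server. Below is a pure-Python sketch of two pipeline stages, modeled loosely on $unwind and $sort (the helper functions and sample documents are my own, for illustration only), showing that swapping the stage order changes the result:

```python
# Pure-Python stand-ins for two aggregation pipeline stages.

def unwind(docs, field):
    """Emit one document per element of the named array field, like $unwind."""
    return [{**d, field: v} for d in docs for v in d[field]]

def sort_by(docs, field):
    """Order documents by a field, like $sort."""
    return sorted(docs, key=lambda d: d[field])

docs = [
    {"name": "b", "tags": ["z", "a"]},
    {"name": "a", "tags": ["m"]},
]

# $sort (by name) then $unwind: documents are ordered first,
# and each document's tags keep their original array order.
a = unwind(sort_by(docs, "name"), "tags")

# $unwind then $sort (by tags): the output is ordered by
# individual tag value instead.
b = sort_by(unwind(docs, "tags"), "tags")

print([d["tags"] for d in a])  # ['m', 'z', 'a']
print([d["tags"] for d in b])  # ['a', 'm', 'z']
```

Same data, same two stages, different sequencing, different output — which is exactly why the pipeline must be designed, not just assembled.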
The “Ask the Expert” sessions provided a place for question-and-answer interactions with the 10gen technical team. 10gen experts were available to answer any question about MongoDB or a specific design or coding challenge. 10gen did a great job executing this type of exchange in a conference setting. At other conferences, all too often you end up dealing with “Marty Marketing” or “Sam the Sales Guy.” 10gen's Randall Hunt bravely threw himself on my RDBMS star schema conversion to MongoDB question. While he did not have the answer at his fingertips, he promised to contact me with more information. And two hours later he emailed me the documentation I needed. Great job.
It is difficult to execute a one-day conference and serve the information needs of the attendees. 10gen kept the conference focused and relevant. Well done. I hope this continues in the future. See you in Chicago!
Today, I met Craig Wortman, master storyteller. His ability to demonstrate the powerful impact stories have on business problems is simply amazing. He is not only a master storyteller; he can also teach you how to use storytelling as a tool for presenting Big Data solutions to business users.
“While the technique of telling stories is the oldest form of communication—it’s also the one form that rises above the din of our information-saturated environment and delivers messages in a way that connects with people, bringing ideas to life and making them actionable and memorable.” – Craig Wortman
So why is this important? As business intelligence and big data professionals, we are horrible storytellers. While striving to create data visualizations, performance management dashboards or a “360 degree” view of the customer, we get lost in the data and forget to tell an actionable, memorable story.
Frankly, I fall into this trap all the time. I put so much effort into the solution that I never think of the story. What story should I tell the business user seeing the data visualization for the first time? What story do I tell my technical colleagues about the challenges encountered to reach the solution? What is the “movie trailer” version of your story? (See Craig's book, “What's Your Story?”)
Each Big Data solution deserves a great story that grabs people emotionally. The story has to go beyond the data or the manner of data presentation. The amount of raw data being analyzed each day is growing exponentially. Each day brings new techniques, tools and disciplines to the Big Data space. But without a Big Story as a starting point, the value of Big Data will remain a mystery to managers and executives.
An actionable and memorable Big Story is a critical component of understanding Big Data. What is the story behind your solution? Why is your solution compelling? I want to know.
With the passing of Mr. Bradbury and the anniversary of his book, Fahrenheit 451, I send the passage below to all of us in the Big Data and Data Science community. Motion is not always progress. And use plenty of the slippery stuff in your work.
“Cram them full of noncombustible data, chock them so damned full of ‘facts’ they feel stuffed, but absolutely ‘brilliant’ with information. Then they’ll feel they’re thinking, they’ll get a sense of motion without moving. And they’ll be happy, because facts of that sort don’t change. Don’t give them any slippery stuff like philosophy or sociology to tie things up with. That way lies melancholy.
“Any man who can take a TV wall apart and put it back together again, and most men can, nowadays, is happier than any man who tries to slide-rule, measure, and equate the universe, which just won’t be measured or equated without making a man feel bestial and lonely. I know, I’ve tried it; to hell with it.”
“… Lecture’s over. I hope I’ve clarified things. The important thing for you to remember, Montag, is we’re the Happiness Boys, the Dixie Duo, you and I and the others. We stand against the small tide of those who want to make everyone unhappy with conflicting theory and thought. We have our fingers in the dike. Hold steady. Don’t let the torrent of melancholy and drear philosophy drown our world. We depend on you. I don’t think you realize how important you are, we are, to the happy world as it stands now.”
– Ray Bradbury, Fahrenheit 451, 1953
I am currently engaged in an exciting project. There are several Big Data technologies that might be used in the future-state solution. However, the exciting part is assembling the data governance requirements. My hypothesis concerns the intersection of ethics and Big Data. I think it is an important question, especially as hardware/software combinations become so powerful that we could very quickly aggregate ourselves into an ethical dilemma.
Stay tuned …
Recently Marek Koenig of Slalom Consulting proposed an interesting approach to increasing the ability of “data end users” to understand ERD diagrams. Often “data end users” understand the company's transactional data quite well. However, when the data is modeled in an ERD and presented to the “data end user,” they often have to work their way back to what they already know!
He applied his creativity to the problem and proposed a data visualization method that would allow “data end users” to view an ERD with color coding to indicate the origin of each field. Read his article for more information: A New ERD.
Recently the McKinsey Global Institute put forth the following projection about Big Data and the impact on corporations.
“Over time, we believe big data may well become a new type of corporate asset that will cut across business units and function much as a powerful brand does, representing a key basis for competition. If that’s right, companies need to start thinking in earnest about whether they are organized to exploit big data’s potential and to manage the threats it can pose. Success will demand not only new skills but also new perspectives on how the era of big data could evolve—the widening circle of management practices it may affect and the foundation it represents for new, potentially disruptive business models.” - McKinsey Global Institute
The article addresses five business strategy and tactical questions.
- What happens in a world of radical transparency, with data widely available?
- If you could test all of your decisions, how would that change the way you compete?
- How would your business change if you used big data for widespread, real-time customization?
- How can big data augment or even replace management?
- Could you create a new business model based on data?
Ultimately, each corporation's IT group must work alongside its internal business customers to maximize the possibilities of Big Data. The pressure to evolve is becoming enormous. For IT professionals, the challenge is to get out in front of the business, with the understanding that without preparedness, costly and unproductive data silos will be created. Now is the time to learn and lead.
On Monday, Feb 13, I presented “Big Data – New Frontiers for IT Management” to the Association of Information Technology Professionals (AITP) in Chicago, IL. It was a lot of fun. AITP is a unique organization in that its membership includes senior IT professionals, technologists of all types and a college student organization, all focused on social networking the old-fashioned way: face to face.
I want to thank the Association for the opportunity and the wonderful dinner. I loved the question-and-answer / “what did we learn” session that followed my talk. I especially want to thank Denny Macumber and Gary Czubak of AITP for their help in making this evening a success for everyone.
As promised here are the links to the reference material requested.
“Big data: The next frontier for innovation, competition, and productivity” – McKinsey Global Institute, May 2011
For the corporate executive, business manager and information technologist, data is now in the driver's seat. The “Age of Big Data” is upon us.
The purpose of this blog is to explore Big Data and Business Intelligence architecture. At BigDataArchitecture.com, architecture is an inclusive term spanning technology solutions, analytics, data science, open source (Hadoop) and commercial products. The goal is to create an open forum that is as relevant to the IT professional as it is to the business manager dealing with the flood of data. To that end, postings will contain analysis and reference material designed to help those currently dealing with Big Data and those wanting to learn more. This will include articles, expert guest bloggers and one-on-one podcasts with industry leaders.
For college students and “career changers” there is a great deal of opportunity in the Big Data world and simply not enough people to do the work today. BigDataArchitecture.com is committed to bringing you relevant information about Big Data careers.
“Data is a new class of economic asset, like currency or gold.”
– World Economic Forum, Davos, Switzerland, 2012.