Are data mindsets helping or hurting system integration?

If you’re like me and you’ve been doing software development and system integration work for a while, you remember the era before big data. Data warehouses and data marts had been around for quite some time, but unless you worked with those groups or with ETL tools specifically, data probably didn’t figure much into the conversation. Everything was about services and objects. SOAP and RPC-based remoting technologies also encouraged some abstraction, which, to varying extents, meant we didn’t think of what we were exchanging as “data.”

As dynamic languages, schemaless databases, big data tools like Hadoop, and REST all became more popular from 2008ish on (coincidence?), the world became less focused on object/class/service design when integrating systems and more focused on exchanging data. It might largely be a semantic difference; I’m not sure. Sometimes I get the feeling, though, that we’re trying to abstract and encapsulate less than we did a decade or so ago, rather than keeping a reasonable focus on tell, don’t ask.

The problem, if that’s what we’re doing, is that we’re sliding back toward a pre-RPC model of system integration, where integrating with another system meant copying data to and from it or, heaven forbid, reaching into its database and making the schema a contract between systems. On the other hand, it’s also possible that improved analytics capabilities from storing all of that data somewhere, plus some of the technologies above, are driving better requirements for vertical application and system integration. I haven’t seen a huge change in my own projects, but I may have a limited sample set.

My recommendation when brainstorming system integrations for passing data from A to B is to consider the events and context around what is happening to “the data.” You may find that the REST resources you build end up representing something different than what you started with.
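To make that concrete, here’s a minimal sketch in TypeScript. The names here (OrderShipped, /shipment-events, and the fields) are hypothetical, just to illustrate the contrast between shipping raw records around and publishing events with context:

```typescript
// A minimal sketch, assuming a hypothetical order-fulfillment integration.
// Names like OrderShippedEvent and /shipment-events are illustrative only.

// The "just move the data" mindset: B periodically pulls A's records wholesale,
// and A's schema quietly becomes the contract between the two systems.
interface OrderRecord {
  orderId: string;
  status: string;          // B has to infer what changed, and why
  updatedAt: string;
  shippingAddress: string;
}

// The event-oriented mindset: A publishes what happened, with enough context
// that B can react without knowing A's internal schema.
interface OrderShippedEvent {
  eventId: string;
  occurredAt: string;      // when it happened, not when a sync job ran
  orderId: string;
  carrier: string;
  trackingNumber: string;
}

// A REST resource built around the event rather than the raw table:
// POST a body shaped like OrderShippedEvent to /shipment-events, or let
// consumers GET /orders/{id}/shipment-events to replay what happened.
```

The second shape tends to push the conversation toward what is actually happening in the business process rather than toward system A’s tables, and in my experience that’s where the better resource design usually falls out.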

Brady Wied