Don't lose time: integrate your data in near real-time.
Your data warehouse, or rather your data analytics solution, is increasingly evolving into a data hub for many new digitization processes. You have to deliver ever more up-to-date data from this hub; hardly a second passes in which it is not updated again.
The problem is that your data warehouse was originally designed to be loaded at most once a day. Your data management processes were never designed for near real-time updates.
Now you are expected to update your data hub promptly whenever a data source changes. To do this, you need to rebuild your data integration around events. That is a challenge, but thanks to Smart Data Automation it is not magic.
To supply your data analytics solution with data in near real-time, you have to convert your batch-based data integration into an event-based one. This requires additional software: an event hub such as Kafka, and change data capture (CDC) functions that identify the latest changes in your sources.
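To make the idea concrete, here is a minimal sketch of the consuming side of such a pipeline: a process reads CDC change events from a Kafka topic and applies each one to the target right away instead of waiting for a nightly batch. The topic name, broker address and Debezium-style event fields are illustrative assumptions, not part of any specific product.

```python
# Minimal sketch: consume CDC change events from a Kafka topic and apply
# them to a target table. Topic name, broker address and the Debezium-style
# event envelope ("op", "before", "after") are assumptions for illustration.
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "sources.orders.cdc",                      # hypothetical CDC topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

def apply_change(change: dict) -> None:
    """Apply one change event to the data hub (stubbed here as a print)."""
    op = change.get("op")
    if op in ("c", "u"):          # insert or update: upsert the new row image
        print("upsert", change["after"])
    elif op == "d":               # delete: remove the old row image
        print("delete", change["before"])

for message in consumer:          # runs continuously, event by event
    apply_change(message.value)
```

The point is the shape of the loop: instead of a scheduled bulk load, every insert, update or delete arrives as an individual event and is applied to the target as soon as it is read.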
Then you need to reconnect every data source through these technologies. That looks like a lot of work, and it usually is if you are not using Smart Data Automation. With biGENIUS you simply generate your event-based data integration for Kafka (as event hub) and, for example, StreamSets (for CDC) automatically. All you have to do is discover the data sources and use their metadata to remodel your data integration. biGENIUS supports you here with wizards, so you don't have to model every data integration by hand.
At this point you might also consider building a brand-new, modern data lake with biGENIUS, which can likewise serve as a data hub for your colleagues, but that is another story.