There is an all-around push to the cloud, driven by the massive simplification of resource management. Done well, it brings massive cost savings as well. There are also many concerns about the security and unforeseen weaknesses of cloud application deployment - not to mention cloud database deployment, where data leaves the boundaries of the company.

Well, on-premise isn't exactly a safe haven. Anything can be hacked. Clicking a link in a legitimate-looking email can get a backdoor installed, nobody notices the ill effects for a long time, and hackers have free rein on the local network. When you hear that a casino has been hacked via a fish-tank thermometer, you know that anything can happen to you, too. I know of one local IT company that kept all its source code on a local GitLab server, supposedly secure. Recently, they had the misfortune of being targeted by crypto-malware - they lost all the source code and all the internal wiki pages with years' worth of procedures specific to each client's infrastructure. They managed to find a backup, not a fresh one, and have been patching things up since. Of course, it was a series of unfortunate events - some internal backup processes had broken down after certain people departed. Still, it happens, and it can happen to any of us.

Basically, anything can be hacked, provided there is sufficient motivation. Even script-kiddies running ready-made scripts they find on the web can make your life miserable, even if they don't understand the inner workings of the scripts they are running. The trove of data on the Internet can readily be used and misused - and it is.

Cloud infrastructure security is state of the art. Cloud providers generally employ extremely capable engineers whose sole task is to make the solution bulletproof. Not many companies can afford that. That said, there are plenty of opportunities to shoot yourself in the foot with cloud infrastructure.

While the cloud greatly simplifies deploying and maintaining infrastructure, it still uses the same building blocks we'd use locally. You need to set up your network, storage, databases, and applications, and wrap your services in containers. You might be able to get rid of containers with a fully serverless setup, but we found that approach lacking for our use case. Still, a lot of things are significantly simplified: you don't need to run around with hardware or replace failing disks. A ton of maintenance has been lifted off your shoulders.

One source of trouble in a cloud setup is not knowing the limitations of the services you're using. Nothing is free. All those scaling and uptime guarantees come with well-defined boundaries on what you're allowed to do. For example, AWS DynamoDB will handle anything you throw at it - unless you haven't purchased enough throughput, in which case your requests are throttled or rejected. Or you might liberally throw in some secondary indexes and be unpleasantly surprised when your bill shows a multiple of the expected costs.
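The DynamoDB throughput surprise is easy to quantify up front. The helper functions below are our own sketch (the names are made up), but the capacity-unit arithmetic follows AWS's documented model for provisioned throughput: one read capacity unit covers one strongly consistent read per second of an item up to 4 KB, and one write capacity unit covers one write per second of an item up to 1 KB. Larger items consume multiple units per request.

```python
import math

def required_rcu(reads_per_sec: int, item_size_kb: float) -> int:
    """RCUs needed for strongly consistent reads: each read of an item
    costs ceil(item_size / 4 KB) read capacity units."""
    return reads_per_sec * math.ceil(item_size_kb / 4)

def required_wcu(writes_per_sec: int, item_size_kb: float) -> int:
    """WCUs needed for standard writes: each write of an item
    costs ceil(item_size / 1 KB) write capacity units."""
    return writes_per_sec * math.ceil(item_size_kb / 1)

# 100 reads/sec of 6 KB items: each read costs 2 RCUs -> 200 RCUs total.
print(required_rcu(100, 6))   # 200
# 50 writes/sec of 2.5 KB items: each write costs 3 WCUs -> 150 WCUs total.
print(required_wcu(50, 2.5))  # 150
```

Note that every global secondary index you add repeats this write cost on the index itself, which is exactly how an innocent-looking schema change turns into a multiplied bill.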

Not knowing what you're doing will bite you in exactly the same way on-premise and in the cloud. Everything should be locked down by default: only services that need to communicate should see each other, and that approach should be followed religiously. For example, if you leave your Elasticsearch server open to the world, the world will eventually notice and start grabbing your data (and your customers' data).
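As a minimal sketch of what "locked down by default" means for Elasticsearch specifically: the settings below are real Elasticsearch options (the values are illustrative), and the key point is to never bind the node to a public interface and never run with security disabled.

```yaml
# elasticsearch.yml - minimal hardening sketch, values illustrative
network.host: 127.0.0.1        # bind to loopback (or a private interface),
                               # never 0.0.0.0 on an internet-facing host
xpack.security.enabled: true   # require authentication and TLS features
```

The same principle applies to any datastore: reach it through your private network or a VPN, and let firewall rules deny everything you have not explicitly allowed.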