Hello! I need a guide on how to migrate data from shared hosting to Docker. All the guides I can find are about migrating docker containers though! I am going to use a PaaS - Caprover which sets up everything. Can I just import my data into the regular filesystem or does the containerisation have sandboxed filesystems? Thanks!
Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:
Fewer Letters | More Letters |
---|---|
DNS | Domain Name Service/System |
VPN | Virtual Private Network |
k8s | Kubernetes container management package |
[Thread #104 for this sub, first seen 3rd Sep 2023, 01:05] [FAQ] [Full list] [Contact] [Source code]
fmstrat@lemmy.nowsci.com 1 year ago
I’ll try to answer the specific question here about importing data and sandboxing. You wouldn’t have to sandbox, but it’s a good idea. If we think of a Docker container as an “encapsulated version of the host”, then let’s say you have:
- Service A, running on your cloud, which needs `apt-get install -y this that and the other` to run and keeps its data in `/data/my-stuff`
- Service B, running on your cloud, which needs `apt-get install -y other stuff` to run and keeps its data in `/data/my-other-stuff`
In the cloud, the Service A data can be accessed by Service B, increasing the attack surface for a leak. In Docker, you could move all your data from the cloud to your server:
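A sketch of that data move, simulated locally so the commands can actually run (the `old-host` paths are stand-ins from the example above; on a real migration you'd pull over SSH with something like `rsync -avz user@old-host:/data/ /local/server/data/`):

```shell
# Simulate the migration: /tmp/old-host stands in for the shared host's
# filesystem, /tmp/local/server/data for the data directory on your server.
mkdir -p /tmp/old-host/data/my-stuff /tmp/old-host/data/my-other-stuff
echo "example" > /tmp/old-host/data/my-stuff/file.txt

# Copy everything across, preserving permissions and timestamps.
# Over SSH this would be: rsync -avz user@old-host:/data/ /local/server/data/
mkdir -p /tmp/local/server/data
cp -a /tmp/old-host/data/. /tmp/local/server/data/
```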
Your `Dockerfile` for Service A would be something like:
Your `Dockerfile` for Service B would be something like:
This makes two unique “systems”. Now, in your `docker-compose.yml`, you could have:
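A sketch of that compose file — service names and build paths are assumptions, but the volume lines mirror the answer's `/local/server/data` layout:

```yaml
# docker-compose.yml (sketch; service names and build paths are placeholders)
services:
  service-a:
    build: ./service-a
    volumes:
      - /local/server/data:/data   # both services see ALL the data

  service-b:
    build: ./service-b
    volumes:
      - /local/server/data:/data   # same mount, same exposure as the cloud
```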
This would make everything look just like the cloud, since `/local/server/data` would be bind mounted to `/data` in both containers (services). The proper way, though, would be to isolate the mounts, so that each service only has access to the data it needs.
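That isolation might look like this — same placeholder layout, but each container is only given its own subdirectory, mounted at the path the app already expects:

```yaml
# docker-compose.yml (sketch; only the volume lines change)
services:
  service-a:
    build: ./service-a
    volumes:
      - /local/server/data/my-stuff:/data/my-stuff             # Service A's data only

  service-b:
    build: ./service-b
    volumes:
      - /local/server/data/my-other-stuff:/data/my-other-stuff # Service B's data only
```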
I hand typed this, so forgive any errors, but hope it helps.