Has anyone used Amazon S3 Glacier or S3 Glacier Deep Archive to secure their Laserfiche data? If so, could you please share information on how you did it?
Question
Amazon S3 Glacier - Secure Laserfiche Data
Replies
Hi Olivia,
Is your inquiry related to the one @████████ made here?:
Amazon AWS S3 Glacier \ Glacier Deep Archive Storage Class
Laserfiche does not directly support S3 for repository volume storage. S3 Glacier Deep Archive in particular can take up to 12 hours for retrieval and is thus unsuitable for live volumes. You can certainly store backups of your Laserfiche data in Glacier/Deep Archive though.
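For the backup route, a minimal sketch of what a scripted upload step might look like, assuming the AWS CLI is installed and the bucket and file paths (which are placeholders here) already exist. The `DEEP_ARCHIVE` storage class is what routes the object to Deep Archive pricing and retrieval rules:

```python
import shlex

def deep_archive_upload_cmd(local_path: str, bucket: str, key: str) -> str:
    """Build the AWS CLI command that copies a backup file into S3
    under the DEEP_ARCHIVE storage class.

    The bucket and key names used by callers are placeholders; substitute
    your own. The command itself is not executed here."""
    return shlex.join([
        "aws", "s3", "cp", local_path,
        f"s3://{bucket}/{key}",
        "--storage-class", "DEEP_ARCHIVE",
    ])

print(deep_archive_upload_cmd("/backups/repo.bak", "my-lf-backups", "repo.bak"))
# aws s3 cp /backups/repo.bak s3://my-lf-backups/repo.bak --storage-class DEEP_ARCHIVE
```

Keep in mind that getting the object back later is a two-step process: you first issue a restore request (`aws s3api restore-object`), wait for it to complete (up to 12 hours for Deep Archive's standard tier), and only then download.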
If you can share more information on the business or technical objective you were hoping to achieve by using S3 Glacier/Deep Archive, perhaps I can suggest a different solution. It would also be helpful to know if your Laserfiche system is already hosted in AWS.
Best,
Sam
It's been a few years, so I wanted to see if anything has changed here. Also, is it just S3 Glacier that's not supported, or all S3 storage?
Hi Michael, no change to object storage (S3/Azure Blob/etc.) support for self-hosted Laserfiche systems. It's not currently on the self-hosted roadmap.
I've made at least one reply on Answers that describes potential options for using third-party SMB-to-object-storage interfaces for repository volumes with self-hosted systems. You can find that here.
Laserfiche Cloud now uses S3 to store repository content.
Hi Sam,
I was looking to see whether anyone had already completed the process, whether they had a script for the vault lock policy, and whether they ran into any big issues along the way.
You mentioned Vault Lock. Does that mean you're looking for immutable/WORM storage in AWS? Is there a specific compliance requirement you're trying to meet?
Because Laserfiche does not directly support storing content in S3, we'll need to know more about why you're looking into this in order to assist effectively.
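On the vault lock script question specifically: a Vault Lock policy is just an IAM-style JSON document attached to a Glacier vault. Below is a sketch that generates a policy along the lines of the one AWS documents, denying archive deletion until archives reach a minimum age. The vault ARN, account ID, and age threshold are placeholder assumptions, not anything specific to Laserfiche:

```python
import json

def vault_lock_policy(vault_arn: str, min_age_days: int = 365) -> str:
    """Build a Vault Lock policy document that denies glacier:DeleteArchive
    for any archive younger than min_age_days (WORM-style retention).

    vault_arn is a placeholder, e.g.
    arn:aws:glacier:us-east-1:123456789012:vaults/my-vault
    """
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "deny-deletes-until-archive-age",
            "Principal": "*",
            "Effect": "Deny",
            "Action": "glacier:DeleteArchive",
            "Resource": vault_arn,
            "Condition": {
                # Glacier exposes archive age (in days) as a condition key.
                "NumericLessThan": {"glacier:ArchiveAgeInDays": str(min_age_days)}
            },
        }],
    }
    return json.dumps(policy, indent=2)

print(vault_lock_policy("arn:aws:glacier:us-east-1:123456789012:vaults/lf-backups"))
```

You would then attach the policy with `aws glacier initiate-vault-lock`, which puts the lock in an in-progress state and returns a lock ID; you have 24 hours to test and either abort or finalize it with `aws glacier complete-vault-lock`. Once completed, the policy is immutable, so test carefully before locking.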