Many enterprise systems today, especially cloud-based ones, expose resource-oriented APIs over HTTP, which lets developers and system integrators connect them to line-of-business software with relative ease. As the Laserfiche ecosystem grows, it is becoming increasingly clear that one of the things holding it back is the lack of such an API. And frankly, it is confusing and somewhat concerning that there is nothing on the roadmap to develop one, at least as far as I'm aware.
Let me give a simple "real-life" example from just this week. We're working with a customer who uses a third-party cloud-based system to generate sales quotes as clean, professional-looking PDFs. The customer asked us whether it's possible to store these PDFs automatically in a specific folder in Laserfiche, along with some metadata (template name, salesperson name and branch, and so on). The third-party system in question has webhooks that can send the generated PDFs elsewhere via HTTP POST, so the system administrator can easily configure any destination that accepts and processes such requests. The idea is that once a quote gets signed and accepted by the client, its PDF can be sent to another system for archival.
The problem is that Laserfiche doesn't have anything that can ingest such requests. Getting it to do so would require writing a web service that uses the Laserfiche SDK behind the scenes, and writing and maintaining such services requires proficiency in .NET/C#, as well as familiarity with the service-oriented architecture of the LF ecosystem. Furthermore, there is no convenient way to authenticate the requests: one either needs to be developed from scratch (LFDS doesn't expose an open, standards-based protocol that non-LF applications can consume, as far as I'm aware), or the web service has to use the hard-coded credentials of a named user with access to the destination repository.
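Just to make the scope of that custom plumbing concrete, here is a minimal sketch of the ingestion half of such a service (Python standard library only, purely for illustration; the payload field names like "document", "template", "salesperson", and "branch" are my assumptions about what a third-party system might send, and the actual hand-off to Laserfiche is the part that currently requires custom .NET/C# SDK development):

```python
import base64
import json


def parse_quote_webhook(raw_body: bytes) -> dict:
    """Parse a hypothetical quote webhook payload: a base64 PDF plus metadata.

    The field names here are assumptions for illustration, not any
    real system's schema.
    """
    payload = json.loads(raw_body)
    return {
        "pdf_bytes": base64.b64decode(payload["document"]),
        "metadata": {
            "template": payload.get("template"),
            "salesperson": payload.get("salesperson"),
            "branch": payload.get("branch"),
        },
    }


# In a real service, the parsed result would then be handed off to the
# Laserfiche SDK to create the entry and populate its fields -- the part
# that today means writing and maintaining a custom .NET/C# service.
```

And this sketch covers only parsing; the real burden is everything around it: hosting, error handling, authentication, and the SDK integration itself.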
The ideal solution would be a Laserfiche product/module (let's call it "LFAPI" for the sake of simplicity) that can be installed on a web server and configured to route requests coming to various endpoints over to their corresponding LF services. For example:
- Fetch a document/folder: GET https://mydomain.com/lfapi/repositories/<repo_name>/<entry_id>
- Store a document: POST https://mydomain.com/lfapi/repositories/<repo_name>/ (include document info in the request body, such as the image, metadata, options, etc.)
- Update document metadata: PUT https://mydomain.com/lfapi/repositories/<repo_name>/<entry_id> (include desired updates in the request body, e.g. metadata fields and values)
- Invoke a workflow: POST https://mydomain.com/lfapi/workflows/<workflow_id> (include input params in the request body)
- Get the status of a workflow instance: GET https://mydomain.com/lfapi/workflows/<instance_id>
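To show the shape of one of these calls, here is how a client might construct the "store a document" request (Python standard library; the base URL, API key, and body field names are all invented for illustration, since LFAPI is only a proposal):

```python
import base64
import json
import urllib.request

# Hypothetical values -- LFAPI doesn't exist; these just illustrate the idea.
BASE_URL = "https://mydomain.com/lfapi"
API_KEY = "example-api-key"


def build_store_request(repo_name: str,
                        pdf_bytes: bytes,
                        metadata: dict) -> urllib.request.Request:
    """Build (but do not send) a POST request that stores a document."""
    body = json.dumps({
        # Base64 is just one plausible way to carry the binary content.
        "document": base64.b64encode(pdf_bytes).decode("ascii"),
        "metadata": metadata,
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{BASE_URL}/repositories/{repo_name}/",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
```

That's the whole integration from the third-party side: one HTTP request, no SDK, no custom service to host.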
For authentication, you should be able to tie API keys to user accounts, then include one as a bearer token in the request's Authorization header for the LFAPI service to validate and use to authenticate/authorize the caller. These kinds of token-based schemes are extremely common; you get the idea.
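Server-side, the check I have in mind is the standard one; a minimal sketch (the key store and its mapping of keys to user accounts are hypothetical — in a real product this would be managed configuration, not a dict):

```python
from typing import Optional

# Hypothetical key store: API keys tied to Laserfiche user accounts.
API_KEYS = {
    "key-abc123": "jsmith",
}


def authenticate(headers: dict) -> Optional[str]:
    """Return the user account tied to the request's bearer token, or None."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return None
    token = auth[len("Bearer "):]
    return API_KEYS.get(token)
```

Once the caller is resolved to a user account, the existing Laserfiche security model (entry access rights, feature rights, and so on) would handle authorization exactly as it does for any other session.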
To conclude, we always talk about how easy Laserfiche is to integrate with other systems, but I think everyone knows that's only a half-truth, and one that is becoming less true as time goes by. Sure, Workflow can send web requests, read from and write to database tables, and so on. But truly robust systems require two-way integrations, which means LF also needs to be able to easily receive documents and data from the outside. Currently the mechanisms for that have to be custom-developed, which makes them expensive and usually quite clunky, because you always end up fighting the existing architecture and working around its design (not to mention licensing!) limitations. But if an "LFAPI" product existed, one that essentially acted as a unified gateway between Laserfiche and the outside world, it would immediately make Laserfiche a much, much more viable middleware piece in complex systems.