Integrating DuckDB with FileMaker (Part 2)

26. November 2024

Integrating DuckDB with node.js as a REST API for FileMaker

Second edition of the DuckDB & FileMaker topic, as presented at the EngageU 2024 conference in Malmö.

Slides and demo files are attached below, along with references for further reading.

After exploring the technical basics of integrating DuckDB with FileMaker in the first presentation, we are going to take it a step further this time.

The revised demo file contains several examples of accessing local and external data sources in a variety of formats. We read and write CSV and Excel. We scan local directories with wildcards and access data files in the cloud. We query REST APIs with authentication and explore creative edge cases, such as reading data from a web page with a simple SELECT statement.
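
To give an impression of what these access options look like in code, here is a minimal sketch using the duckdb Node.js module (the same module the REST API from the earlier posts is built on). The file names, the example URL and the queried columns are placeholders for illustration, not the ones used in the demo file:

const duckdb = require('duckdb');           // npm install duckdb
const db = new duckdb.Database(':memory:'); // in-memory database, nothing is persisted

// Local CSV file: schema and column types are detected automatically
db.all("SELECT * FROM read_csv_auto('sales_2024.csv') LIMIT 5", (err, rows) => {
  if (err) throw err;
  console.table(rows);
});

// All CSV files in a directory, combined into one result via wildcard
db.all("SELECT count(*) AS n FROM 'data/*.csv'", (err, rows) => {
  if (err) throw err;
  console.log(rows);
});

// Remote file over HTTPS (requires the httpfs extension)
db.exec("INSTALL httpfs; LOAD httpfs;", (err) => {
  if (err) throw err;
  db.all("SELECT * FROM read_csv_auto('https://example.com/exports/orders.csv') LIMIT 5", (err, rows) => {
    if (err) throw err;
    console.table(rows);
  });
});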

After we have tried out the very flexible access options, we will turn our attention to the topic of performance again. Using a real-world use case, we will compare the enormous speed of DuckDB with the native options in FileMaker.

Finally, you will learn some best practices for setting up an ETL process from FileMaker, where DuckDB prepares the data for use in analytical dashboards. A demo case with Evidence shows how to easily integrate a dashboard into the FileMaker solution that generated the data.

This presentation offers a balanced mix of background information on the strengths of DuckDB and shows practical approaches for using potent open source tools to supplement the toolbox within FileMaker business solutions.

With the included information and the attached demo file, you have a good starting point for your own experiments. It’s best to get started right away!

The presentation slides are available for download here:
EngageU Presentation – Integrating DuckDB (PDF 15.7 MB)

Demo files (FileMaker and node.js) are available for download here:
DuckDB_Demofiles_v2.zip (ZIP 197 KB)

Further reading:

EngageU Presentation: Integrating DuckDB with FileMaker

If you have completed the first steps and want to dive deeper into DuckDB, ETL and dashboards, the following list will help you find further resources.

References: DuckDB

References: DuckDB behind the Scenes

References: Dashboards

EngageU 2024 European FileMaker Conference

23. November 2024

EngageU 2024

Recap of the #EngageU conference, which took place in Malmö, Sweden, from November 17th to 19th.

Thanks to the engagement of the independent developer companies Square Moon and ClickWorks, who joined forces to prepare and present another high-profile event for the Claris FileMaker community, this was a real highlight to round off the 2024 event calendar.

200 attendees from 25 countries came to Malmö to network and exchange ideas – and, of course, to attend some of the 37 sessions, trainings and panels offered in 4 tracks during the two main conference days.

Not only did the community show up: Claris, as a main sponsor, also joined in, opening the conference with an official keynote and some interesting strategic and technical news.

The conference team, represented by Johan Hedman and Joris Aarts, held the opening session, talking about their motivation and presenting some impressive numbers about the event.

This was followed by the Claris keynote from Ryan McCann, the new Claris CEO, with a promising outlook on the ongoing transition of the platform. (You can find out more in the latest episode of The Context Podcast: “Updates from Claris with Ryan McCann“.)

Afterwards, Andrew LeCates, who recently celebrated his 40-year career anniversary inside Apple, got more technical, showing some great use cases illustrating the ‘Workplace OS’ approach of the FileMaker platform.

Wesley Powell went even deeper into the technical backgrounds of server architecture, covering scalability, multi-tenancy, security, microservices, Kubernetes containers, Kafka, MongoDB, etc.


Key messages from Claris:

  • Claris is partnering with Apple to expand market share for the Claris products
  • The platform is evolving with regard to revenue and new features
  • Claris Studio and Connect have matured and serve as extensions to existing FileMaker solutions
  • AI and semantic search bring new capabilities for working with existing data
  • New documentation and learning resources are provided to lower the barrier to entry for new and existing developers

The full conference agenda offered a wide range of topics, from highly specialized technical content to interesting real-world use cases, business topics and presentations of state-of-the-art developer tools.

A central topic was once again AI in various contexts. This nicely complemented the demand for practical use cases and the technical capabilities supplied by the new FileMaker versions.

If you are interested in more details, have a look at the conference website:
https://engageu.eu/agenda-2024/
The EngageU team promised to share presentation materials and session videos later, so you can use the rich spectrum of expertise for your own needs.

Thanks to the conference team, to all sponsors and anyone who was involved for making this great event possible!


PS: A summary of my own presentation “Integrating FileMaker and DuckDB” is available here on my blog.

Building a REST API for DuckDB

11. July 2024

REST API for DuckDB

A step-by-step guide

Create a REST API using Node.js for remote access to DuckDB. This makes it very easy to embed DuckDB in other applications, such as FileMaker.

With the included demo files (refer to bottom of this post) you can create a running test environment on your local machine within a few minutes. It is of course also possible to install the service on a separate machine and access it via HTTP request. This can be useful to easily integrate DuckDB into an existing production environment as a microservice.

All the necessary steps are explained in the following instructions.

Please note that there is an updated demo file available for download (if you tried it out before November 2024).




1) Installation of Node.js

Download the prebuilt installer from the official site:
https://nodejs.org/en/download/prebuilt-installer

Run the installer on your machine.

After Node.js is installed, check whether the application responds in your Terminal.

Query version:

node -v


2) Install the demo files for the API

Download the zip archive with the demo files.

Create a folder “nodejs” in the user directory and copy the folder “duckdb-rest-api” into it.

Now change to the directory with the demo file:

cd /Users/yourname/nodejs/duckdb-rest-api/

Replace ‘yourname’ in the path with your own user name!

Install and query the current version of npm (this step is optional, because npm is usually installed by the Node.js installer):

npm install -g npm
npm -v

Now you are ready to install the needed modules for our project.


3) Install/Update the required modules via npm

Make sure that you are in the “duckdb-rest-api” folder. Then run the following command in your Terminal:

npm update

npm will look up the package.json file for any needed modules and install or update them in the “node_modules” folder. This should work automatically. Since different binaries are required for different platforms, the node_modules folder is supplied empty when you extract it from the zip archive. This process ensures that all modules are installed/updated in the correct way for the platform you are using.

After the npm update command is finished, you will find a few log lines ending with a line similar to this:

added 188 packages, and audited 189 packages in 9s

The “node_modules” folder that was empty before is now updated and should contain a bunch of subfolders with all installed modules (about 120 MBytes).


3 b) Install the required modules manually (optional)

If you are having trouble with npm updating the “node_modules” folder, you can use this method to load the modules for a fresh installation:

npm install express
npm install duckdb
npm install sql-formatter


4) Start the API script

Change to the “duckdb-rest-api” folder (if not already done with the above steps) and run the script file:

node duck_v2.js

node is the call to the Node.js engine;
duck_v2.js, passed as a parameter, is the name of the API script file.

A message should now appear in the terminal indicating that the API script is running and the local web service has been started:

Server Listening on PORT: 3000


Please note: there are two versions of the script file available (a rough sketch of the overall script structure follows below):

  • duck.js Version 1.5 with basic functionality
  • duck_v2.js Version 2.6 with enhanced functionality:
    • Use a named database file for persistence (optional)
    • If no named database file is given, DuckDB defaults to an ‘in-memory’ DB
    • GET endpoint ‘/version’ for version info
    • POST endpoint ‘/format-query’ for SQL syntax coloring
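
For orientation, here is a rough, minimal sketch of how such a script can be structured with Express and the duckdb module. This is not the actual duck_v2.js: the endpoint names follow the list above, but everything else, including the request and response fields (such as a query key in the POST body), is a simplified assumption; refer to the downloaded script for the real implementation.

const express = require('express');
const duckdb = require('duckdb');
const { format } = require('sql-formatter');

// Optional named database file for persistence; default is an in-memory DB
const dbFile = process.argv[2] || ':memory:';
const db = new duckdb.Database(dbFile);

const app = express();
app.use(express.json());

// Simple health check, as used in step 5)
app.get('/status', (req, res) => res.json({ Status: 'Running' }));

// Version info
app.get('/version', (req, res) => {
  db.all('PRAGMA version', (err, rows) => {
    if (err) return res.status(500).json({ error: err.message });
    res.json(rows[0]);
  });
});

// Pretty-print a SQL statement
app.post('/format-query', (req, res) => {
  res.json({ formatted: format(req.body.query || '') });
});

// Run a query against DuckDB and return the result rows as JSON
app.post('/', (req, res) => {
  const sql = req.body.query;
  if (!sql) return res.status(400).json({ error: 'Missing query' });
  db.all(sql, (err, rows) => {
    if (err) return res.status(400).json({ error: err.message });
    res.json(rows);
  });
});

app.listen(3000, () => console.log('Server Listening on PORT: 3000'));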


5) Test query of the API in the browser (optional)

http://127.0.0.1:3000/status

{"Status": "Running"}


6) Open FileMaker file

Locate the provided demo file DuckDB API.fmp12 and open it with FileMaker Pro. You will find some basic examples to explore the possibilities of DuckDB.

You might study the included scripts and techniques to understand how it works. The file is unlocked and provides more information inside the ‘Read me’ script.

The FileMaker file communicates with the API via the contained script and sends the entered queries via the ‘Insert from URL’ script step to the address ‘http://127.0.0.1:3000/’. The responses are decoded accordingly and displayed in the given result field.



FileMaker DuckDB Demo file available for download



That’s it!
You are now able to run the demo requests and start your own experiments.

If you are not using the FileMaker demo file but want to integrate DuckDB into your own environment via the REST API, you can do this by simply sending POST requests to the specified localhost URL on port 3000.

Have a look at the duck.js script file, where you can find basic documentation about the supported request scheme.

Node.js REST API script is available for download

It might be helpful to send your requests with Postman or a similar tool to understand the structure of the API communication via JSON messages, and to send your own SQL queries to the DuckDB engine.
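
As an illustration, a request can also be sent from a small Node.js script (version 18 or newer, with the built-in fetch). This sketch assumes the request body carries the SQL under a query key; check the documentation in duck.js for the exact scheme your version expects.

// send_query.mjs: send a SQL query to the local DuckDB REST API and print the JSON result
const response = await fetch('http://127.0.0.1:3000/', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: 'SELECT 42 AS answer' })
});
console.log(await response.json());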




If you want to deploy the setup on a separate computer or VM, it makes sense to create an autostart script for the REST API. Please refer to the following plist file to make it work as a LaunchAgent on macOS. For other platforms, a similar setup is required using the mechanisms of the respective operating system.

The same applies if you want to autostart the script on your local machine.


7) Set up LaunchAgent for Autostart

mkdir -p ~/Library/LaunchAgents
nano ~/Library/LaunchAgents/com.username.nodescript.plist

This creates an empty plist file named ‘com.username.nodescript.plist’ in the user’s Library folder and opens it with the nano editor.

Now you can paste in and modify the following XML snippet:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.username.nodescript</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/local/bin/node</string>
        <string>/Users/yourname/nodejs/duckdb-rest-api/duck_v2.js</string>
    </array>
    <key>WorkingDirectory</key>
    <string>/Users/yourname/nodejs/duckdb-rest-api/</string>
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>

Make sure to replace ‘username’ and ‘yourname’ with your own user name in all positions! The LaunchAgent config will not work if you only enter a shorthand for the user directory such as ‘~/nodejs/duckdb-rest-api/’.

If the plist file with the contained XML snippet is present, macOS starts the Node.js script automatically on reboot and ensures that the working directory for the script is set correctly.

The script is executed in the background without a visible terminal window. You can check if the API is available by looking in your browser – see step 5) above.

Start Launch Agent manually:

launchctl load ~/Library/LaunchAgents/com.username.nodescript.plist

Stop Launch Agent manually:

launchctl unload ~/Library/LaunchAgents/com.username.nodescript.plist



Demo files

DuckDB API.fmp12



Demo files (FileMaker and node.js) are available for download here:
DuckDB_Demofiles_v2.zip (ZIP 197 KB)



Further reading:

FileMaker and DuckDB – what’s special about this liaison?

15. June 2024

FileMaker + DuckDB

After discovering DuckDB by chance and giving it a try, it was time to get a real project done.
I did some experiments and found a way to integrate DuckDB with FileMaker, which was the topic of my presentation at the dotfmp conference in Berlin two weeks ago. Since the resonance was quite exciting and the feedback from other FileMaker developers was encouraging, I decided to adapt an existing solution to the possibilities of DuckDB.

Get ready for a deep dive

In my first experiments with DuckDB I had just touched the surface. Now it was time for a real deep dive…
A few months ago I implemented a project with FileMaker to answer a strategic business question that had so far remained unanswered.

It was a really complex scenario for analyzing a huge dataset. The approach included

  • unpacking nested groups of labels
  • normalizing multiline keys
  • aggregating on missing relational data
  • generating data based on if-then-else logic
  • calculating statistical weights from aggregated results over the whole dataset
  • normalizing the weights over subsets of the data in several dimensions
  • and finally extrapolating the whole dataset via a cross product of labels, weights and several measures

The original analysis I created in FileMaker required a rather complicated logic that included

  • generating metadata via scripts
  • multiple loops over all dimensions and millions of records
  • calculating intermediate results
  • writing status flags back to the data in order to filter aggregations

When it finally worked, it took more than 10 hours of processing time. The final results then had to be manually transferred into a huge Excel document to display multiple pivot tables with the results in different dimensions.
The impact of the results was really worth it. But now the decision makers wanted to have this report on a regular basis.

This was an ideal candidate to find out whether DuckDB could fulfill the promises I had read about. It took me some days to understand the logic of the approach to this task and to discover more and more of the rich toolset DuckDB offers. Finally the new project got traction, and I learned some really neat tricks that were only possible thanks to the powerful and flexible SQL dialect of DuckDB, which extends classic SQL with awesome features like handling of nested structures, conditionals, pivoting, regular expressions, string functions, date functions and much more.

Unleashing the power

To make a long story short: Not only was it possible to rework the entire project within a short period of time, but the end result was far more advanced than I could ever have dreamed of.

FileMaker: 12 Tables, 25 TOs, 86 Helper Fields, 10 Custom Functions, 56 Scripts, 1524 Script-Steps
DuckDB: 1 Query, 27 SELECT Statements

Using DuckDB, it was possible to turn all the data processing into one giant SQL statement consisting of a series of SELECT commands to import all the existing data from CSV files, generate the calculated data and output the results in exactly the form that was needed. Thanks to its in-memory capabilities, the processing time was reduced from 10 hours to less than 4 seconds(!!).
Really amazing!

FileMaker: 10 hours
DuckDB: 4 seconds

With this power under the hood, I finally went one step further and had DuckDB generate a new dataset consisting of a cross product of the original data with the sliced subdivisions of labels and weights that were calculated as a temp table during the aggregation process. This new dataset consists of about 30 million rows of data; it took less than a second to create it, and less than 3 seconds to write 2.8 gigabytes of data to a new CSV file. Mind you, I did not use a supercomputer or some scaled-out cloud service; all this happened on my local machine, a MacBook Pro with an M1 CPU.
Awesome!
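
Just to illustrate the general shape of such a statement, here is a heavily simplified sketch with invented table and column names, not the logic of the real project: a chain of CTEs reads a CSV export, derives intermediate results step by step, and a final COPY writes the result set straight to a new CSV file, all executed through the same duckdb Node.js module used by the REST API.

const duckdb = require('duckdb');
const db = new duckdb.Database(':memory:');

// One statement: read the CSV export, build intermediate results as CTEs,
// and write the final result straight back to a CSV file.
db.exec(`
  COPY (
    WITH
      raw     AS (SELECT * FROM read_csv_auto('export_transactions.csv')),
      labels  AS (SELECT id, unnest(string_split(label_list, ';')) AS label FROM raw),
      weights AS (
        SELECT label, count(*) * 1.0 / (SELECT count(*) FROM labels) AS weight
        FROM labels GROUP BY label
      )
    SELECT r.id, l.label, w.weight, r.amount * w.weight AS weighted_amount
    FROM raw r
    JOIN labels  l ON l.id = r.id
    JOIN weights w ON w.label = l.label
  ) TO 'result.csv' (HEADER, DELIMITER ',');
`, (err) => {
  if (err) throw err;
  console.log('result.csv written');
});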

Go one better…

Now, with the new result set, there is a foundation to build another project on top of this processing pipeline and bring further enhancements to our existing reporting. I had built an OLAP tool in our FileMaker solution to make sense of transactional data by being able to filter and aggregate by more than 20 dimensions. But since we have collected about 25 years of transactions, which means tens of millions of FileMaker records, the processing time in FileMaker got slower and slower over the years. To compensate, we migrated the server hardware every few years to the fastest machines available at the time.

A typical report needs several minutes to select a subset of the data via a find request and then calculate the results with summary fields and sorted sub-summary layouts. That is indeed a nice and flexible way of doing this in FileMaker. But as processing times grew over the years, these powerful features were used less and less by the users who need the results to make business decisions based on the data. We ended up creating some scripted reports that are processed overnight once a month and automatically sent to our controlling team as PDF files.

Now we see some opportunities to advance our OLAP tool towards “real-time” analytics with the help of DuckDB. It could then produce final results instantly at the push of a button in the FileMaker user interface. That would be a huge advancement in delivering an interactive tool for finding answers to strategic questions on the fly.

It comes down to the simple fact that big CSV files with shadow tables of all the data can serve as a database of their own, since DuckDB is able to process the files at blazing speed. It would be nice to take the integration of DuckDB and FileMaker a few steps further for more elegant interactions, but the possibilities discovered via the REST API are good enough to unfold the magic of analytical power and build new ways of interacting with the business data that already exists inside the FileMaker solution.

Conclusion

FileMaker is a great tool that enables us to build critical business solutions by storing and managing data. This is why we often call it a “database”. In fact, we know it is much more than that, but it has its roots in this category of database software. If you think of DuckDB as just another database, you will miss the point. Yes, it’s true that DuckDB is also in the category of database software, but it has a whole different approach to dealing with data.

While FileMaker is optimized for reading and writing rows of data, DuckDB is optimized for processing and aggregating columns of data, which is the source of its unbelievable speed. While FileMaker is optimized for transactions and access by multiple users on a server, DuckDB is optimized for single processes on a single machine, without any features for record locking or network access from multiple users or devices. It just reads, transforms and writes data in very flexible and powerful ways. So each of these two databases has its very own feature set, and they are ideal in combination because they complement each other nicely.

FileMaker and DuckDB is a winning team! There are many more use cases to be discovered, the journey has only just begun.


Further reading:

dot.fmp 2024

10. June 2024

dotfmp 2024

The dotfmp conference took place in Berlin from June 5th to 9th. Just under 80 participants from 18 countries met once again to exchange ideas on current FileMaker topics in the conference rooms of the Schankhalle Pfefferberg. Traditionally, things kicked off on Wednesday evening with a get-together at the Prater Biergarten.

Over the three conference days, a total of 39 sessions with English-language talks, discussions and workshops were offered in two parallel tracks. One focus topic this time was the use of AI, with concrete case studies and instructions on how AI can be integrated into FileMaker solutions, complemented by exciting background information on LLMs, embeddings, semantic search and more. But there was also a colorful bouquet of other topics, including Claris Studio, Web Viewer, continuous deployment, load balancing, server monitoring, Let’s Encrypt, fmeIDE, Data API, OData, Inspector Pro, MBS, JSON, FMBetterforms, security audits, React, error handling, technical debt, logging, DuckDB, best practices and much more.

The consistently high level of the topics and presentations was impressive. There was also plenty of opportunity for personal exchange. The colorful Berlin pub district around the venue offered plenty of options for swarming out in small groups in the afternoon and evening.

It is precisely this combination of professional know-how and the very open, almost friendly atmosphere among the participants that has been the recipe for dotfmp’s success for 12 years. The reaction to the organizer’s announcement that there will be no further edition of dotfmp was correspondingly wistful. With standing ovations for Egbert Friedrich and a spontaneously organized farewell session with a photo retrospective of the past years, the event came to a very emotional end on Saturday. For everyone who was there, it was a real privilege to immerse themselves in the spirit of this special conference one last time.

Thank you, Egbert!





dotfmp Berlin
https://dotfmp.berlin/welcome

Podcast “Claris Talk AI”
Interviews with Experts at DotFMP Berlin
including an interview with Matt Navarre and Egbert Friedrich
https://youtu.be/qIlK6PgdsQ4
https://overcast.fm/+BHFoSWCrd4

Integrating DuckDB

08. June 2024

Integrating DuckDB with Node.js as a REST API for FileMaker


Please find attached the slides and demo files from my presentation at the dotfmp conference in Berlin.

DuckDB is a rising newcomer on the block. It has a small footprint, is blazingly fast and runs on all platforms. With its extended SQL commands, it goes beyond other database engines and offers an interesting feature set. Optimized for storage in columns, it offers a comprehensive toolset for fast data analysis. With its ability to read data from multiple sources – even directly from cloud storage – it can be used as an ETL tool.

We take a quick look at DuckDB and examine how to integrate its power into FileMaker solutions. As an additional benefit, you will learn how to use Node.js to create a simple REST API that is instantly accessible with just a few lines of FileMaker script.

The presentation slides are available for download here:
dotfmp Presentation – Integrating DuckDB (PDF 2.1 MB)

Demo files (FileMaker and node.js) are available for download here:
DuckDB_Demofiles_v2.zip (ZIP 197 KB)

Further reading:

Claris Platform Update – The Bottom Line

14. October 2023

FMDiSC Claris Platform Update

The latest news update from Claris has made some noise. Now that the dust has settled a little, it is time to look at the various aspects of a major transition that has now been going on for more than 4 years.

Let’s take a look back at the major adjustments in the Claris strategy and why they were necessary. We provide answers to the latest developments and put them in a broader perspective.

We take a look at how Claris has managed to clarify its strategy for the entire platform, examine which issues have been resolved and where action is still needed.

At the invitation of the colleagues from FMDiSC, I was able to give another presentation, picking up where last year’s presentation ended.

Claris Platform Update – The Bottom Line
FMDiSC October Meeting 2023
https://youtu.be/tc2f2MNylcQ?t=5466

The Future of FileMaker – Position determination
FMDiSC October Meeting 2022
https://youtu.be/2FQ1iE1aDAI?t=137

Claris Product Strategy Update 2023

30. August 2023

Claris platform update

There has been some speculation about the future of FileMaker over the past two years. In the meantime, after numerous announcements on the part of Claris, more and more concrete results gradually became visible. Some things looked promising, in other places there was still a lack of clarity, and some things were less than optimal in implementation and especially in communication.

What was positive, however, was that Claris, after initial hesitation, actively entered into dialogue with partners and developers. Gradually, results were presented and feedback was obtained again and again. In the meantime, one can definitely say that Claris has listened and taken the suggestions and concerns of its long-term partners seriously!
Of course, it is not always easy, especially with such a reorientation, to reconcile all the right goals at the first attempt. It is all the more important that during a major transformation there is room for corrections and a re-sharpening of the strategy.

Now Claris has made some changes: Many suggestions for improvement have been taken into account and the bottom line is that the waiting time and the intensive exchange with the community have been worth it. Although not all goals have been achieved in the implementation yet, it is now clearer than before that the direction is right! Moreover, Claris has already delivered results in some places. The most recently released versions make a solid impression.

And most important: while the new products “Studio” and “Connect” are given a meaningful place in the platform strategy, existing solutions can benefit from the innovations without the need for an elaborate technical migration. The best of the new and old worlds simply grow together seamlessly.

In addition, Claris now gives all existing customers and developers free access to the new tools so that they can prove themselves in practical tests. We can “kick the tires” without any additional licensing hurdles or technical barriers. So things can finally move forward!

Refer to the latest Claris announcements from August 2023:
Blog: Claris Product Roadmap
YouTube: Claris Product Strategy Update 2023


Let’s look back at the most important points from last year’s presentation “The Future of FileMaker”. There, we had summarised some issues and wishes from the customers’ and developers’ point of view. We now look at where the current announcements from Claris provide answers.

To put it briefly: a lot has been taken into account. Thank you Claris!

Refer to PDF for download (12.6 MB):
The Future of FileMaker – Review and Update 2023.

Blog: The Future of FileMaker – FMDiSC Meeting Oct 2022
Recording of the presentation on YouTube: The Future of FileMaker

FileMaker Magazin: Eine Einordnung – Was sich bei der Claris Produkt-Strategie ändert (in German language)

FM JSON Path

21. August 2023

FM JSON Path

Sometimes it would be useful to have a flexible way to access the elements within a JSON object, something like XPath for XML documents.

FileMaker itself offers only very limited JSON functions. With JSONGetElement it is possible to query a known path within the JSON object, but no wildcards or even search filters are possible.

This is exactly what the “JSONPath” library offers. It enables an XPath-like query syntax for JSON objects. A detailed description with examples can be found on Stefan Gössner’s website:
https://goessner.net/articles/JsonPath/



This can be used to realise the following queries, for example:

  • $.store.book[*].author -> the authors of all books in the store
  • $..book[-1:] -> the last book in order
  • $..book[?(@.isbn)] -> filter all books with isbn number
  • $..book[?(@.price<10)] -> filter all books cheaper than 10

See more examples and full sample JSON on the project website.
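
To illustrate how such queries behave, here is a small sketch using the classic JavaScript implementation linked below, which exposes a global jsonPath(obj, expr) function. The store object is a shortened version of the sample data from the project website.

// Assumes jsonpath.js (see the download link below) has been loaded first,
// e.g. inside a web viewer or the MBS JavaScript environment.
var store = {
  "store": {
    "book": [
      { "category": "reference", "author": "Nigel Rees", "title": "Sayings of the Century", "price": 8.95 },
      { "category": "fiction", "author": "Evelyn Waugh", "title": "Sword of Honour", "price": 12.99 },
      { "category": "fiction", "author": "Herman Melville", "title": "Moby Dick", "isbn": "0-553-21311-3", "price": 8.99 }
    ]
  }
};

jsonPath(store, "$.store.book[*].author");  // ["Nigel Rees", "Evelyn Waugh", "Herman Melville"]
jsonPath(store, "$..book[-1:]");            // the last book ("Moby Dick")
jsonPath(store, "$..book[?(@.isbn)]");      // only the book that has an isbn number
jsonPath(store, "$..book[?(@.price<10)]");  // the two books cheaper than 10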



There is a JavaScript implementation of this library which can be downloaded from the Google Code Archive:
https://code.google.com/archive/p/jsonpath/downloads

With the help of the JavaScript functions of the MBS plugin, the JSON Path library can be easily integrated into FileMaker and then called via script command:
https://www.mbsplugins.eu/component_JavaScript.shtml

Note the difference between the functions “JS.CallFunction” and “JS.CallFunctionValues”. While the first function expects its parameters in JSON format, the second function works with parameters in text form. If this difference is not observed, the values arrive incorrectly in the JavaScript function and cannot be processed. Instead, one receives an error of the type “invalid json (at offset 1)”.

Once these hurdles have been overcome, the content of a JSON object from FileMaker can be queried with a simple script via MBS using a JSON Path expression. You get the result in text form, or you can simply convert it back into JSON.



Attached is a FileMaker file with the examples from the JSON Path website.
The JavaScript library is already included in the FileMaker demo file.
The MBS Plugin version 11.3 or newer is required.

Download: FM JSON Path.fmp12 (319 KB)



Update 2023-10-02:
JSON Path has been implemented in the latest version (13.5) of MBS plugin. Please refer to MBS documentation and corresponding blog post:
MBS JSON.Query
JSON Query in FileMaker

Claris in dialogue

22. December 2022

Claris Dialogue

There is something going on in the communication of Claris. In the meantime, new version updates for FileMaker have been shown and further innovations have been announced. While the past months were characterized by hopes and fears, confidence is now returning with regard to the future of the Claris platform.

Highlights of the past weeks were:

  • The Claris Product Team Webinar at the beginning of November
  • The Claris sessions at the engageU conference in Malmö
  • The Claris Partner exclusive webinar in early December
  • The release of FileMaker version 19.6
  • New announcements about Claris Connect (Custom Connectors!)
  • The announcement of the freemium version for the new Claris platform
  • The technical tutorials in the “Claris Engineering Blog” (refer link)
  • The blog post by Robert Holsey in the Claris Forum (refer link)
  • Q&A with Andrew LeCates at FMDiSC December Meeting (refer link)
  • New in 2023: Claris Update Webinar – Jan 25 (refer link)

The message from Claris is now clear: “We’re on it! We are building the future! Be patient!”. It was also reiterated once again that FileMaker will continue to evolve as a product and nothing will be broken. On the contrary, all improvements should benefit the entire platform. And wherever there are still gaps in “Claris Pro” in terms of functionality or technical features compared to FileMaker, which stand in the way of a migration, these should be fixed as soon as possible.

OK, Claris – we heard you!
Thank you for your efforts!

So we can be curious about the novelties in the upcoming year…




Links:

Images: Claris, @JorisAarts, @dotfmp