These are posts by David.

How to avoid collisions using UUIDs

There’s a simple way to avoid collisions when using UUIDs as primary keys: Use UUIDs as primary keys.

No, that’s not a typo. The idea that Get ( UUID ) will create a duplicate ID in a single table, in a single file, or even across all tables in all the FileMaker files you ever create is, simply, false.

I can hear someone out there (likely here) saying: “Whaaat? Just because it’s very unlikely, doesn’t mean it can’t happen”.

And my answer is: “Actually, it does mean that. That’s exactly what it means.” A collision of properly created, randomly generated UUIDs is so unlikely that when we start slicing words like “can’t” or “impossible” to include such an event, we are presenting a false view of the world.

The likelihood of a collision is so absurdly small that, as far as the human brain can conceive of it, a collision is impossible. With the levels of risk we’re talking about, saying “it is impossible” to have a collision is more accurate than saying “it is possible”. It’s like calling the statement “The sun will rise tomorrow” false because “it’s possible” an alien civilization has launched a near-lightspeed, moon-sized object at Earth that will hit tonight, destroying the planet and preventing any more sunrises. After all, it’s possible.

This point is about properly created UUIDs. It is non-trivial to create a unique sequence of numbers and letters. While the process has been standardized, some people don’t follow the standards. According to David McKee, Senior Software Engineer at FMI, the Get ( UUID ) function uses the operating system’s native UUID function: on macOS it is CFUUIDCreate() and on Windows UuidCreate(). FM engineers are smartly delegating UUID creation to Apple and Microsoft.

It seems completely reasonable that FM’s Get ( UUID ) is rock solid.

The UUID RFC standard used by FM is “version 4, variant 1”, where the UUID is randomly constructed (not based on the time or the computer ID). Pulling from Wikipedia’s page on UUIDs: the UUID is composed of 32 hexadecimal digits (using the base-16 system of 0 through 9 and the letters A through F). The standard’s formatting fixes a few of the bits, and in the end there are 122 bits of randomness, which produces 2^122, or roughly 5.3 undecillion, possible IDs. “Undecillion” is a cool word, but really, the human brain can’t grasp the significance of its enormity. If you follow the sequence of billion, trillion, quadrillion, eventually you get to undecillion, which starts with the Latin prefix for eleven (one and ten).
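
The arithmetic is easy to check; here's a one-line sanity check (Python, just for illustration):

```python
# Sanity check: 122 random bits gives ~5.3 undecillion possible IDs.
possible_ids = 2 ** 122
print(f"{possible_ids:.1e} possible v4 UUIDs")  # 5.3e+36
```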

That number can’t be understood because of its immensity, so let’s try to reduce it to human-relatable terms. First, few people are concerned about one particular UUID being duplicated; developers care whether any duplicate occurs, which is an example of the Birthday Paradox. Then I’ll introduce real-life variables to get a sense of the number.

Probability, like hugely large numbers, can be hard to wrap our heads around. We can’t just say, “What are the chances there’s a duplicate UUID?” We need to specify the risk we’re looking at. I think a one in a billion chance is a pretty safe risk.

How many UUIDs need to be created for there to be a one in a billion chance of a collision among any of them? The answer: 103 trillion.

Let ( [
//odds of a collision: one in a billion
probability = 1000000000 ;

//these calcs do the prep work
p = 1 / probability ;
x = Ln ( 1 / ( 1 - p ) ) ;
y = 2 * 2^122 * x ;

//number of UUIDs needed
z = Sqrt ( y ) ;

//use r.factor to round the result to its leading 3 digits
r.factor = -1 * Length ( Int ( z ) ) + 3 ;

result = Round ( z ; r.factor )
] ;
result )
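
The same birthday-bound math, sketched in Python for anyone who wants to check it outside FileMaker:

```python
import math

# 122 random bits in a v4 UUID
N = 2 ** 122

def uuids_for_collision_odds(odds):
    """How many random UUIDs give a 1-in-`odds` chance of at least one
    collision, using the same birthday-bound approximation as the
    FileMaker calc: n = sqrt(2 * N * ln(1 / (1 - p)))."""
    p = 1 / odds
    return math.sqrt(2 * N * math.log(1 / (1 - p)))

n = uuids_for_collision_odds(1_000_000_000)
print(f"{n:.3g} UUIDs")  # about 1.03e+14, i.e. 103 trillion
```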

There we are. To get a one in a billion chance of a duplicate, we’d have to create 103 trillion UUIDs. 103 trillion. That number is so big, there’s no real way to describe it, except to say it’s impossible.

Some humans, like writers and journalists, try to make Very Large Numbers accessible. “If you stacked 10 trillion dollar bills you’d reach the moon and back.” Which doesn’t actually help; it’s still incomprehensible, because who can actually relate to moon travel, except for Neil Armstrong and friends? (Side note: Neil Armstrong used to tell bad moon jokes, and when no one laughed, he’d mumble, “Guess you had to be there….”)

Imagine there are 100,000 FileMaker developers (we and FMI wish!). Each developer creates 10 files a year. Each file has 100 tables. Each table has 10 million records. Each record uses a UUID.

In 10 years, that comes to 10 quadrillion (10^16) UUIDs, and still only about a 1 in 100,000 chance that there will be any duplicate among all the UUIDs created by all the developers in the world. There is drastically less likelihood that such a duplicate would have a real-world impact.
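
Plugging the thought experiment into the same birthday approximation gives odds in the same no-need-to-worry ballpark (Python, as a check):

```python
import math

N = 2 ** 122  # possible v4 UUIDs

# 100,000 developers x 10 files/year x 100 tables x 10 million
# records per table, over 10 years:
n = 100_000 * 10 * 100 * 10_000_000 * 10

# Birthday approximation: P(any collision) = 1 - exp(-n^2 / (2 * N))
p = 1 - math.exp(-n * n / (2 * N))
print(f"1 in {1 / p:,.0f}")  # roughly 1 in 106,000
```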

The human brain isn’t great at understanding very large numbers or probability, but math doesn’t lie. For any reasonable definition of impossible, UUID collision is impossible.


Pause on Error Los Gatos: Day & Night

I went to the latest Pause on Error to learn what I don’t know. I walked away still not knowing a whole lot, but being energized to learn.

Held over Memorial Day weekend at “The Presentation Center” (a dry name for a wet place) in Los Gatos, California, this PoE earned the nickname “FileMaker Summer Camp”. The bucolic location in the Santa Cruz mountains, on a site loaded with natural beauty and Catholic iconography, was simply stunning. I saw a deer my first half hour on the property, and another trio a couple hours later. In the morning, clouds blanketed the tops of redwoods. In the afternoon, a free permit got us onto miles of trails in the bordering nature preserve. It may have been too ascetic for those hoping for a traditional tech conference experience. There was no hotel bar. No lazy river. No double beds, television, housekeeping, or even private bathrooms. The first session started at 6:30 in the morning.

Spartan by most standards, it was not everyone’s cup of herbal tea, but it sure was mine. It was pretty much exactly what I needed at this point in my life. Maybe lots of experiences could have been, but this one was.

The organizers, Todd Geist of Geist Interactive; John Sindelar of SeedCode; and Ernest Koe of the Proof Group, intended to push 70+ developers to try something different, and they succeeded. Todd’s yin meshed well with John’s yang as both exhorted/encouraged attendees to learn, adapt, and challenge themselves. They led by being vulnerable, creating a space where others could be too. The weekend was the most intimate conference I’ve been to.

Each morning and evening there was a 90-minute meditation led by John Tarrant of the Pacific Zen Institute and Todd Geist. Between 15 and 30 people were seated at each 6:30am session. Generally, conference attendees respected a silent breakfast after the morning meditation. The food was basic buffet and nourishing. Before dinner, Jonny Lee, a Chi Running instructor, led attendees in a stretching and running-technique session. John Tarrant led the evening meditation with a koan (which I learned is pronounced “ko-ahn”), and interlaced the quiet sitting with a discussion of the experience. One koan, asked by John Tarrant in his resonant voice: “Quickly, without good or evil, what is your original face before your parents were born?”

Between breakfast and the afternoon running session were more standard conference sessions. Pause is intended to be a “participatory” conference where attendees present; however, the schedule of two sessions in the morning and two in the afternoon over 2 1/2 days didn’t really pan out that way. The sessions I attended fell into three rough categories: technical, inspirational, and discussion.

I attended

Jason Young (SeedCode): Technical. Visual apps using <canvas> in a web viewer
Todd Geist (eponymous): Inspirational. Use APIs to access the immensely vast library of the world’s tech
John Sindelar (SeedCode): Inspirational. Embrace the “I don’t know”

Vince Mennano (Beezwax): Technical. Data visualization using Tableau
Rosemary Tietge (FMI): Discussion. FileMaker Community
Todd Geist: Technical. Full stack, node.js

John Renfrew: Technical. Data visualization with d3
All: Discussion. Presenter / spectator general discussion

There were several sessions I missed. Obviously, the few times sessions ran concurrently, I would have missed one. I also skipped a time slot every afternoon for a nap. And an extra afternoon session for a run. Those that I missed were:

Jason Young: hitting the Salesforce API. Matt Navarre: Running FM on AWS. Nancy Botkin & Mark Lemm: JSON. Lui de la Parra: Node and FM. Ernest Koe: Enterprise FM. Jason Young: cURL, card windows.

Of the sessions I saw, the discussion and inspirational ones resonated like a singing bowl. The technical ones were too technical for me, and there was significant overlap between subjects. Unfortunately, there were simply not enough sessions. I contributed to that by not presenting.

Evenings wound down differently. The first night I was asleep 20 minutes after evening meditation. The second, I had a quiet discussion with a couple great developers afterwards. And the third night, I spent hanging out at ‘Lower Maria’, the cabin where Canadians go to party.

My professional takeaways were more inspirational, around the importance of trying new things and pushing my knowledge into new technical areas: using API calls to web services to do the grunt work of development, learning JavaScript, and terminal / command-line programming.

My personal benefits were an interest in Zen practice and a deeper appreciation for my fellow FileMaker developers.

I have found that the conferences I enjoy most are the ones I return from energized. This conference was great in that I’m coming home with something I won’t just use in my professional life, but in my personal one too.

I can’t express my gratitude to the organizers, presenters, and other attendees enough. It was a wonderful experience.


-David Jondreau





Interesting quirk for unrelated field reference in field definition

One apparent limitation of using ExecuteSQL() in the separation model is that you can’t use unrelated tables in a field definition, so you’d need to relate all your tables to one another.

I just discovered that you can bypass that limitation simply by wrapping your statement in a Let() and declaring a variable that wraps a field in the GetFieldName() function.

So the expression:
Let ( field = unrelated::table ; field )
returns an error, while:

Let ( [ gfn = GetFieldName ( unrelated::table ) ; field = unrelated::table ] ; field )
does not.

It doesn’t even have to be a field from an unrelated table. Interesting and hopefully helpful.

Triggering a server-side script Part II

In Part I of this series, I explained how to trigger a server-side script using XML. In this part, I’m going to give you a custom function to aid this process and explain why you need to enable the Guest account.

First, the Guest account. A user accessing the XML engine on FileMaker Server from a browser will always be taken to a login page, with one exception: when the [Guest] account is turned on. Since there’s no browser involved in the Insert from URL[] script step, and no way to auto-login, there’s no way to access the server with anything but [Guest].

If you don’t need [Guest] for anything but this, it opens a little security hole, but that can be mitigated. There are several ways to do this, but I’ll only give one method. First, turn off all Extended Privileges except Access via XML Web Publishing (fmxml). Then give “No Access” to everything except the scripts you’re going to call by XML and the opening and closing scripts. All of those scripts and their called subscripts should be set to run with “Full Access” checked. That way you’re letting the scripts manage their permissions. Again, there are other ways of accomplishing this that may be more secure, depending on your solution’s needs.

I love custom functions. They make life so much easier. You don’t need a custom function to call a script via XML, but it ensures your syntax is correct and reduces development time.

Here’s the one I created for this:


/* ( script )
Calls the script specified in the parameter in the current database. This function should be placed in an Insert From URL[] script step. The [Guest] account needs to be active and have permission to run the specified script; all other access should be limited. The script should run with Full Access privileges.
David Jondreau
Wing Forward Solutions, LLC
*/
Let ( [
http = "http://" ;
xml = "/fmi/xml/fmresultset.xml?-db=" ;
host.ip = Get ( HostIPAddress ) ;
file = Get ( FileName ) ;
layout = "&-lay=" & Get ( LayoutName ) ;
view = "&-view" ;
script = "&-script=" & script ;
result = http & host.ip & xml & file & layout & view & script
] ;
result )
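
For readers more comfortable outside FileMaker, here's the same URL assembly sketched in Python; the host, file, layout, and script names below are made up purely for illustration:

```python
def fm_script_url(host_ip, file_name, layout_name, script_name):
    """Build the CWP XML URL that triggers a server-side script,
    mirroring the custom function above (no URL-encoding handled)."""
    return (
        "http://" + host_ip
        + "/fmi/xml/fmresultset.xml"
        + "?-db=" + file_name
        + "&-lay=" + layout_name
        + "&-view"
        + "&-script=" + script_name
    )

# Hypothetical values, just to show the shape of the result:
url = fm_script_url("192.0.2.10", "Invoices", "Web", "ProcessInvoices")
print(url)
# http://192.0.2.10/fmi/xml/fmresultset.xml?-db=Invoices&-lay=Web&-view&-script=ProcessInvoices
```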



Triggering a server-side script Part I

FileMaker does most of its processing client-side. That means when a client wants to do a find, or show the result of a calculation, the data necessary to do the action is transferred from the server to the client, and the client calculates the result. This is handy because you don’t need super-powerful servers, and the load on the server itself is reduced. But there’s a tradeoff: in some circumstances, it can be extraordinarily painful. If a lot of data needs to be moved, the transfer becomes a huge bottleneck.

There is a solution to this. A FileMaker Pro (or Go) client can trigger a server-side script using a call to the web publishing engine. This is really handy if you have a task that touches a lot of records or is otherwise data-intensive. A script that would take minutes locally takes seconds on the server.

The solution is to use the FileMaker 12 script step, Insert From URL[], to place an XML call to the Custom Web Publishing Engine. That XML call specifies the script to be run by the server. Pre-12, I believe you can use Set Web Viewer[] for similar results.

To implement this:

1) Enable Custom Web Publishing with XML on the server.
2) Write the script you want to run on the server. It can only use Custom Web Publishing compatible script steps. Check “Run script with full access privileges”.
3) Enable the [Guest] account in Manage Security. Allow only the XML Extended Privilege. Give access to no records, no value lists, and all layouts. You should choose to allow access to only the specific server side scripts you want to call.
4) Write your calling script. The calling script is what will run on your client. The core script step is Insert From URL[]. That should take a URL of the format http://HOST.IP.ADDRESS/fmi/xml/fmresultset.xml?-db=FILE.NAME&-lay=LAYOUT.NAME&-view&-script=SERVER.SCRIPT
5) Run your calling script from the client!

Part II will include a custom function to create your URL, an example of a feeder script to manage the process, and an explanation of why the Guest account is necessary.

Part III will be an exploration of how this process can be used to create FM Go “updateable apps”.

Running an iPad database as a Kiosk

Doug Alder of HomeBase Software out of Vancouver, BC posted an excellent explanation of locking down an iOS device to use an FM Go database in “Kiosk” mode, using the Accessibility options. While you’re there, check out his FileMaker Timeline (which isn’t working in Chrome for me, but in Firefox it looks great).

FileMaker Server 12 Plugins

tl;dr: Plug-in developers can now update their plug-ins to run in 64 bit mode.
There’s an undocumented change to how the Web Publishing Engine runs that affects where to install plug-ins to be run by CWP.

A few weeks ago FMI released the FMS 12 v2 update, adding support for 64-bit plug-ins used by CWP solutions. FMI also released a new software developer’s kit (SDK) for plug-in developers. Until this SDK, there was no way for plug-in devs to update their plug-ins to run in 64-bit mode. So some FMS 12 plug-ins (possibly all, I’m not sure) could, until this update, only work in 32-bit mode. That was okay for most purposes, but the Web Publishing Engine (WPE) runs in 64-bit mode (on a machine running in 64-bit mode; you could run your 64-bit machine in 32-bit mode).

So far, I haven’t seen any releases of updated plug-ins, but I’ve read or communicated with several devs who are actively working on updates: Goya, Troi, 360Works, and 24U. I’m assuming most of the top-level plug-in devs are on the ball.

There is one poorly documented change that should be noted. It’s documented in a PDF in the new plug-in SDK and nowhere else that I can see.

There is now a new folder path for installing plug-ins to be used by Custom Web Publishing (CWP). Instant Web Publishing (IWP) uses the same file path as before (…/publishingengine/wpc/Plugins), but CWP plug-ins now go into …/publishingengine/cwpc/Plugins.

This change is not reflected in the updated FMS documentation (an otherwise superb PDF); it’s only in the PDF that comes with the plug-in SDK. When called by CWP, the function Get ( FileMakerPath ) also returns a /cwpc/ path.

Thanks to Nick Orr at Goya and Diana Budding at Troi for being patient with my questions and to Obinna Oparah at 360Works for a couple informative posts on FM Forums.

SMTP & Gmail

Sometimes someone else posts the definitive piece. Read this blog post by Lee Lukeheart, founder of Savvy Data, for the skinny on FileMaker and Gmail.

Pre-fill new fields with related data, quickly

Sometimes you need to add a field to an existing database that should have values for all the already existing records in the table.

Say you’ve added a new field to an existing database.

It’s not a calculation field, but it has an auto-enter calculation and should always be filled out.

For example, in the table People you add a field called “Full Name”. It’s a combination of the First Name and Last Name fields, which are in the same table:
= First Name & “ “ & Last Name
You want users to be able to modify the combination, so it’s an auto-enter field and not an uneditable calculation. Easy enough, except that there’s no value in the field for any of the existing records.

You could script adding the data, using Replace[] or a loop with Set Field[], but that can take a long time, especially if the table has a lot of dependencies, there’s a large record set, and/or you’re working on a served file. Instead, you can leverage the power of FileMaker’s storage and one little trick.

The trick: if you change a stored calculation field to a non-calculated field (Text, Number, Date, etc.), the values persist in the field.

Make the field a regular calculation field. Save out of Manage Database. FileMaker then stores the data, and all your records now have the value you want, but in an uneditable calculation field. Re-open Manage Database and change the field to Text, check the auto-enter calculation box (the calculation should auto-fill), and save out again.

Now you have a Full Name field where every existing record has the calculated value pre-filled in!

Additionally, this can work with related data.

If you want to pull the Full Name field from your Contacts table into an Invoice table, but have it stored, you would create a text field with an auto-enter calc that looks through a relationship based on INVOICE::fk ContactID <–> CONTACTS::pk ContactID.

To pre-fill all the existing records, you again make it a calculation field, but since FileMaker won’t let you store a calculation that references related data, we get help from our friend Evaluate(). Wrap the related field name inside an Evaluate():
= Evaluate ( “CONTACTS::Full Name” )

You can then store that calculation and save to exit Manage Database. Open Manage Database again, change the field to a text field, remove the Evaluate() wrapper, and save to exit again.

Your new auto-enter text field now has related data in all the existing records.

Migrating WordPress

We’ve got a few websites based on WordPress. They’re little sites and have been hosted with GoDaddy since launch. GoDaddy is cheap, cheap, cheap, and good enough for the sites we’ve had up. But we want to drive more traffic to those sites, take hosting control of a couple other more complicated sites, and get out from GoDaddy’s limitations.

We settled on a hosting reseller account through HostGator. That gives us room to expand and freedom to configure the server as we want.

This was fairly easy. We logged into cPanel and then followed the directions in this video here. The video is great and there’s no need to re-say here what they say there. Basically it shows how to set up ‘packages’, which determine how much space and bandwidth a particular site will get.

Then you can ‘Add a New Account’ in cPanel where you tell it the domain name and choose a package for its settings.

After we had the new account set up, we followed HostGator’s directions for moving a WordPress blog. (See here.)

Their directions are pretty good. We backed up the database, downloaded and uploaded the WP files, changed wp-config.php and imported the blog as directed.

Ta-dah! We could now go to our new IP address and see the site. However, the CSS wasn’t taking effect and the links all went back to our old site.

At the end of their directions, HostGator has a little section on this very problem. They say you should log in to wp-admin for your site and, in the General Settings, change the site URL and home to the new address. We did this and then realized that because we hadn’t changed the DNS name servers yet, we now had a loop happening which didn’t allow us to log back into wp-admin! In effect, we had changed the old database instead of the new database. Now the new database was telling it to go to the old URL and the old database was telling it to go to the new URL! To fix this, we followed HostGator’s directions here. This meant logging into phpMyAdmin for both databases and changing the url and home variables in both places.

Then we logged into the new wp-admin and changed the links from ‘pretty’ links to the default setting.

Now we had wp-admins accessible for both versions of the site and the links were working, but our CSS still wasn’t taking effect. We looked at the page source and it showed the right link to the file. We tried uploading a new image and placing it in a new post. The wp-admin could find the photo just fine, but the page wouldn’t display it.

Finally, we thought to look at the Error log in cPanel. It told us, over and over, “(13)Permission denied: /home/public_html/wp-content/uploads/.htaccess pcfg_openfile: unable to check htaccess file, ensure it is readable” for this file and various others in the wp-content folder.

After doing some web searching, we read other accounts of problems with FrontPage, a WYSIWYG editor. We uninstalled it. CSS still wasn’t taking effect. Then we took a look at the permissions on the wp-content folder. On the folder itself they were fine, but the items below it (themes, uploads, etc.) were set way too low. So we set all the directories inside wp-content to 755 and all of its files to 644.
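
That permissions fix can be scripted; here's a sketch in Python (the same thing is often done with two `find` commands), with the wp-content path as a placeholder:

```python
import os

def fix_wp_permissions(wp_content):
    """Recursively set directories to 755 and files to 644,
    the fix that got our CSS loading again."""
    os.chmod(wp_content, 0o755)
    for root, dirs, files in os.walk(wp_content):
        for d in dirs:
            os.chmod(os.path.join(root, d), 0o755)
        for f in files:
            os.chmod(os.path.join(root, f), 0o644)

# fix_wp_permissions("/home/public_html/wp-content")
```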

Now the CSS is working!