Minutes, Not Hours

Software products for 3-D model compliance checking exist, but the code-based rule sets that govern the process are not yet well developed or tested. Neither is an automated code-checking workflow, and many of the intended users—code enforcement officials—have little exposure to any of it.

Gould says the goal of the AutoCodes project was to address all of that by engaging code officials in trialing the software, a specific rule set for A&E, and the supporting processes together. He says the committee decided to start the project with A&E codes, which are largely based on objective values for clearance distances and geometry and are similar across all of the jurisdictions covered.
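To make that concrete, a rule built on objective geometry can be reduced to a direct comparison of a measured dimension against a code threshold. The sketch below is illustrative only; the data model, the names and the 815-mm value are assumptions for the example, not Solibri's actual rule language.

    # Illustrative sketch: the data model, names and 815-mm threshold are
    # assumptions for this example, not Solibri's rule language or values.
    from dataclasses import dataclass

    @dataclass
    class Door:
        door_id: str
        clear_width_mm: float  # measured directly from the model geometry

    MIN_CLEAR_WIDTH_MM = 815  # an assumed accessible-door minimum

    def check_door_clear_width(door: Door) -> str:
        # An "objective" rule: one measured dimension, one threshold, which
        # is why A&E requirements lend themselves to automated checking.
        return "PASS" if door.clear_width_mm >= MIN_CLEAR_WIDTH_MM else "FAIL"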

Industry partners included Target Corp., which supplied the plans and model; Avolve Software, the company Gould works for as director of marketing, which supplied a portal and a "semi-automatic" 2-D plan-review management system called ProjectDox Electronic Plan Review; and Solibri LLC, vendor of Solibri Model Checker, software that checks 3-D models for quality and compliance with sets of rules.

Code authorities covering the 13 jurisdictions that conducted the initial 2-D review included "real-world" plan-review departments and their personnel from Bend, Ore.; Salt Lake City; Mecklenburg County, N.C.; and Philadelphia.

Texas jurisdictions Amarillo, Carrollton and Houston participated as well, along with four California locales: Irvine, Livermore, Redwood City and San Jose. State code officials in New Jersey and New York rounded out the group.

Gould says the manual checks usually took four or five hours, while the software check of the 3-D model "took about a minute."

Paper Trail

The model-checking software found anomalies, including a too-short ramp that all of the manual checkers had missed. It also returned many requests for more information where it found the detail in the model insufficient to apply its rules; a lack of space classifications was a common problem.

One mechanical room failed the automated compliance check, Gould says, because it wasn't designated as a mechanical room in the model, so the code checker erroneously applied the same standards it would apply to a public space. The room did not actually need to comply with that standard, but by declaring it failed, the code checker at least demonstrated that the room had been examined. In fact, the software logs everything it checks, and its performance can be audited. That is not the case with manual code review, Gould notes.
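A rough sketch of that behavior might look like the following. The structure, names and values are hypothetical, meant only to show how a missing classification can push a checker onto a stricter default rule set and how logging every check makes the run auditable.

    # Hypothetical sketch of the behavior described above; names, values
    # and rule sets are illustrative, not the actual Solibri implementation.
    from dataclasses import dataclass
    from typing import Optional

    # Assumed minimum clearances by space type; the strictest ("public")
    # rules serve as the fallback when a space has no designation.
    MIN_CLEARANCE_MM = {"public": 915, "mechanical": 610}

    @dataclass
    class Space:
        name: str
        clearance_mm: float
        classification: Optional[str] = None  # None = not designated in the model

    audit_log: list = []

    def check_space(space: Space) -> str:
        # A space without a recognized classification is checked against the
        # conservative public-space rules, as the mechanical room was.
        rules = (space.classification
                 if space.classification in MIN_CLEARANCE_MM else "public")
        result = "PASS" if space.clearance_mm >= MIN_CLEARANCE_MM[rules] else "FAIL"
        # Every check is recorded, so the run can be audited afterward,
        # which a manual plan review does not offer.
        audit_log.append((space.name, rules, result))
        return result

Run against the undesignated mechanical room, the fallback routes it through the public-space rules and logs a failure, leaving exactly the kind of trace Gould describes.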