Friday, July 27, 2007

Sun Improves Ultra Performance And Lowers Prices (01jun97)

June 1, 1997

For a number of years, UNIX workstations from Sun Microsystems lagged those of other vendors in this market segment in raw performance. With the introduction of the UltraSPARC-I last year, Sun became much more competitive. The company recently introduced a new 64-bit 300-MHz UltraSPARC-II microprocessor that, with 4 MB of L2 cache memory, has a SPECint95 rating of 12.7 and a SPECfp95 rating of 19.7. On this basis, it outperforms Hewlett-Packard’s 180-MHz PA-8000 by about 10% and delivers about 80% of the performance of Digital’s 600-MHz Alpha 21164. According to Sun, it outperforms the new 266-MHz Intel Pentium II by 30% to 90% on engineering applications.

This new chip is available in single- and dual-processor workstations called the Ultra 2 Model 1300 and Model 2300, respectively. The base system comes with 128 MB of main memory, expandable to 2 GB. Each unit can support up to 8.4 GB of internal disk storage and an incredible 273 GB of external storage. We were surprised, however, that the maximum transfer rate is only 20 MB/second, since 40-MB/second technology is readily available. These units support either Sun’s Creator or Creator3D graphics.

Prices for the Model 1300 and Model 2300 workstations with Creator graphics start at $26,495 and $38,495, respectively. Pricing with Creator3D graphics is $27,495 and $39,495. These new systems, which provide a 30 percent performance boost over the company’s 200-MHz former flagship models, are being offered at the same price the 200-MHz systems were before the company cut the prices of existing products by about 30%.

The Ultra 1 Model 140 workstation, which listed for $9,995, has now been replaced with the Ultra 1 Model 170 workstation, with a list price of $9,195. The Model 170 is architecturally the same as the Model 140 but includes a 167-MHz processor instead of a 143-MHz processor. Sun has also reduced prices for the SPARCstation 5 workstation with a 20" monitor, to $5,895 from $6,895, mostly due to lower monitor costs.

Raster Software And AutoCAD R14 (01jun97)

June 1, 1997

Rasterex and GTX Corp. separately announced that their respective AutoCAD-based raster editing and conversion software lines will work with AutoCAD R14: Rasterex’s does so today, and GTX’s will within the next few months.

AutoCAD’s New Raster Ability Establishes A Foundation

AutoCAD R14 includes a new capability to recognize and handle scanned raster images as AutoCAD objects that can be scaled and positioned just like AutoCAD blocks. This makes it easier to integrate scanned paper drawings, blueprints, and digital photographs into AutoCAD files as hybrid drawings containing both raster and vector data. Anyone using AutoCAD or R14-based applications can view and plot these files.

Uses for this capability in conjunction with R14-based applications range from paper-to-CAD conversions (partial or complete) to digital terrain modeling (DTM) in geographic information systems (GIS).

The raster architecture was developed internally by Autodesk in conjunction with input from the AutoCAD Technical Imaging Partners group.

Rasterex Support of AutoCAD R14

Rasterex’s RxAutoImage 97 add-on software family works with AutoCAD R13 and R14 as well as AutoCAD LT95. Their software extends R14’s whole-image raster object capability to include selecting just part of an image for editing and/or conversion to vector. This includes individual entities such as lines, arcs, and circles.

The company reports that its key functionalities include line-following with orthogonalization, single-pick object tracing, single-pick hatch object selection, automatic clean-up wizards, advanced rubbersheeting, and raster-to-vector conversion with optical character recognition (OCR).

RxAutoImage Pro 97 retails for $3,895 while RxAutoImage Edit 97 lists for $1,995. The less expensive package does not include automatic raster-to-vector conversion or OCR conversion capabilities. Expert Graphics is the U.S.-based distributor of Rasterex software.

GTX Offers Two Software Lines

GTX noted that with R14’s capabilities, the capacity limitations of previous releases of its AutoCAD-based software no longer apply, including limits on the number and size of images that may be loaded at a given time. Multiple images can appear simultaneously in multiple viewports.

GTX now offers two software lines that provide raster editing and conversion. The GTXRasterCAD 4.0 series works inside AutoCAD, while their new ImageCAD series offers the same capabilities (including commands and DWG read/write) as a standalone product. ImageCAD uses the AutoCAD OEM engine at its core.

The commands have the same names as AutoCAD commands, but with the prefix G. Raster editing, manipulation, and conversion are greatly aided by a neat feature called intelligent object picking (IOP). IOP recognizes basic entities such as arcs, circles, and lines even when they intersect with other entities. The IOP feature makes working with raster files less tedious.

Raster-to-vector conversion works decently on everything from individual entities to entire files. Text conversion from raster to vector is also good. Contour map conversion is handled by special programs designed to handle dense contour lines. We watched a few demonstrations of this capability and were impressed by how well it worked on a Pentium PC. It’s worth noting that the resulting file format is proprietary.

Batch conversions of scanned files to AutoCAD files can be done with the GTX OSR module and text is converted using the GTX ICR PLUS module. These modules work with both the RasterCAD series and the ImageCAD series.

Prices are charged per user per application. For RasterCAD this varies from $1,995 to $4,890 while ImageCAD varies from $1,395 to $4,990. Versions are available for AutoCAD R14 and R13 on Windows 95/NT as well as AutoCAD R11 and R12 on DOS, Windows, and UNIX (Sun and HP).

Expert Graphics, 800-648-7249, 404-320-0800,
Rasterex (International) +47 22 23 92 90,
GTX Corp., 800-879-8284, 602-244-8700,

Two Different Books On GIS Suit Beginners and Non-Technical Executives Very Well (01jun97)

June 1, 1997

The GIS Book, 4th Edition, Understanding the Value and Implementation of Geographic Information Systems, by George Korte, P.E.

GIS: A Visual Approach, by Bruce Davis

These two books explain different aspects of geographic information systems (GIS), but both do a great job of explaining GIS without requiring readers to have a technical background. The GIS Book, 4th Edition explains the core basics of the technology but spends most of its time covering other major factors that really make or break a GIS implementation. This is the book for executives, managers, and users who want the big picture with relevant details and guidelines.

GIS: A Visual Approach primarily focuses on explaining the technology without jargon and with many diagrams and pictures. The author wrote the book in a way that even non-native English speakers should be able to pick up the technical fundamentals without much trouble. Many of the diagrams and pictures are supplied on accompanying diskettes for computer display, printing, or slide making. While this book is an excellent GIS primer, we feel that it contains information that many of our readers will consider too basic. Therefore, we will concentrate on the other book in this article.

The GIS Book, 4th Edition

This new edition "was written for people who want to know about the selection, implementation, uses, and benefits of GIS, but don’t need to know all of the technical details of how GIS actually works," according to author George Korte, P.E.

The book is divided into three parts. The first part explains the fundamentals of the technology, what can be done with it, and an overview of the GIS industry. The diagrams and pictures included in this section are enough to establish a very basic idea of how CAD, AM/FM, and GIS differ.

The second part provides some very helpful guidance in selecting and implementing a GIS. The step-by-step implementation process diagram is worth the price of the book alone for those getting started. This section also reviews four leading GIS software vendors (ESRI, Intergraph, Landmark Graphics, and MapInfo) using their sales literature minus unsubstantiated claims. No comparisons or evaluations are given, but the section quickly demonstrates the vendors’ different competencies. It should also be noted that the author works for Intergraph.

The third part reviews key factors about GIS such as financial justification, legal aspects, economics of base map accuracy, and getting CAD data into a GIS. Several case studies are provided as well. The details provided were both interesting and useful.

Some of the more notable points from the book were:

  • A GIS program is neither a small nor simple undertaking (it is very different from selecting a CAD package for a design group).
  • It is far more critical to determine your organization’s needs for the GIS (not just a department’s needs) and decide which GIS package best fits them than it is to determine how the different systems compare with each other.
  • The cost of data conversion can range between 60 and 80 percent of the total cost of implementing a GIS. The two most overlooked factors are completing the data entry and checking/correcting the data once it is in the system.
  • Consider teaming up with other companies or organizations in your area to share the cost of creating the GIS database.
  • The root problem behind unsatisfactory GIS programs is usually organizational and operational issues, not the hardware, software, or a vendor.

We found several small mistakes regarding computer technology in The GIS Book, but these were not critical to its explanation of GIS.

Contact: Onword Press, 888-763-8786, 505-474-5120,

Pentium II, K6 and 6x86MX CPUs Live Up To Expectations (01jun97)

June 1, 1997

Computers featuring the new Intel Pentium II, AMD K6, and Cyrix 6x86MX central processing units (CPUs) are now widely available in the market, so we decided to take a look at some underlying details regarding these workstation-class processors.

The Pentium II has turned out to be more than just a multimedia-enhanced Pentium Pro. The new processor heralds a mini-motherboard design for the processor and the L2 cache, it runs 16-bit operating system code and applications with no penalty, and its speeds are faster.

Pentium II Easier to Build, Less Elegant, But Still Faster

The Pentium Pro packaged its processor and L2 memory cache side by side in a single unit, and both ran at the same clock speed. This yielded great performance but apparently was difficult to manufacture. While we can’t say exactly what occurred at Intel with regard to the Pentium Pro’s evolution, this manufacturability issue probably drove the redesign more than anything else.

The Pentium II has essentially become a mini-motherboard (or daughter card) of discrete components encased as a cartridge that plugs into a computer’s motherboard. This breaks away from the tradition of more components on a single chip.

The CPU inside the cartridge is smaller, and as a result, it is faster since the electrical signals have less distance to travel. The L1 memory cache inside the chip has grown from 16K to 32K (16K instruction/16K data). The L2 cache has grown from the prevalent 256K to a standard 512K, but the distance to the CPU increased and the L2 memory speed dropped to half that of the CPU.

This overall arrangement still results in better performance, since the processor can store more information in the L1 and L2 caches before going to main memory, which is dramatically slower.
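The tradeoff described above, larger caches but a half-speed L2, can be sketched with a simple average-memory-access-time model. All hit rates and latencies below are hypothetical illustrations for this toy model, not Intel’s figures:

```python
# Illustrative average memory access time (AMAT) model for a two-level
# cache hierarchy. Hit rates and latencies are hypothetical, chosen only
# to show why larger L1/L2 caches can win despite a half-speed L2.

def amat(l1_hit, l1_cycles, l2_hit, l2_cycles, mem_cycles):
    """Average cycles per memory access for an L1 -> L2 -> DRAM hierarchy."""
    l1_miss = 1.0 - l1_hit
    l2_miss = 1.0 - l2_hit
    return l1_cycles + l1_miss * (l2_cycles + l2_miss * mem_cycles)

# Smaller caches: more misses fall through to slow main memory.
small = amat(l1_hit=0.90, l1_cycles=1, l2_hit=0.80, l2_cycles=4, mem_cycles=60)
# Doubled L1 and L2: higher hit rates, even with a slower (half-speed) L2.
large = amat(l1_hit=0.95, l1_cycles=1, l2_hit=0.90, l2_cycles=8, mem_cycles=60)

print(f"small caches: {small:.2f} cycles/access")  # 2.60
print(f"large caches: {large:.2f} cycles/access")  # 1.70
```

Even with the L2 latency doubled, the higher hit rates of the larger caches cut the average access cost in this sketch.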

No 16-Bit Limits, Plus MMX

The Pentium Pro actually ran 16-bit code slower than its Pentium siblings did, a result of the assumption that users would primarily be running 32-bit programs. Thus, the Pentium Pro architecture did not incorporate 16-bit-oriented features such as frequent updates of its segment registers.

The Pentium II architecture incorporates faster updates as well as the 57 new instructions for multimedia (the MMX capability). The hardware and software architecture changes result in PCs that technically run faster although Pentium Pro users using Windows NT and 32-bit programs are unlikely to notice any dramatic differences.

For many companies there is a noticeable benefit to this 16-bit code efficiency. Setting up and managing Windows 95 is easier than Windows NT. Small businesses can network and install new computer devices on their own without the PC experts that are often required for NT (UNIX is much worse). However, Windows 95 isn’t an option if your primary programs and/or hardware do not run with Windows 95 or if you want the security options available with NT.

Other Considerations

Pentium II CPUs are limited to single- and dual-processor configurations for the near future. That’s fine for workstations, as we have seen with a loaned Intergraph 425 TDZ running leading CAD/CAM and analysis applications. Meanwhile, servers that need four processors will continue to be made with Pentium Pro CPUs.

It turns out that the Pentium II has some sort of esoteric computational bug. Intel says it is working on a software fix, but in-depth tests by IBM have not found any problems running applications. We appreciate Intel’s handling of the matter, and we don’t think the bug is worth delaying any Pentium II purchases you may be planning (we’re buying one ourselves).

Lower-administration PCs (NetPCs) have also been demonstrated with the Pentium II. A few vendor tests have demonstrated Pentium II PCs being fully configured over a network by a server. For big companies, this will be of immense help if it works.

Clone CPUs Face A Lock-Out

The Pentium II’s cartridge holder talks to the rest of the system via a new connection called Slot 1 and a proprietary bus on the motherboard called the P6 bus. The P6 processor bus currently runs at 66 MHz, the same speed as the previous non-proprietary Pentium bus, but Intel plans to increase this to 100 MHz by 1998 with new motherboard chipsets that will provide additional, undisclosed features.

The extra performance sounds great, although the proprietary Slot 1 connection and P6 bus effectively lock clone CPU vendors out of Pentium II motherboards.

HP Reaffirms Its Commitment To UNIX (01jun97)

June 1, 1997

We recently had an opportunity to attend Hewlett-Packard’s 1997 executive briefing for industry analysts and came away impressed with how quickly the company is reacting to the changes sweeping through the computer industry. HP seems to redefine itself more rapidly than any other major company in this industry. The three key subjects this year were the continuing explosion of the Internet/intranet, the emergence of electronic commerce as one of the growth areas of the future and the recognition that Windows NT has become a key industrial tool.

From Lou Platt (president and CEO) and Rick Belluzzo (executive vice-president and general manager of HP’s Computer Organization) on down, everyone at HP goes out of their way to emphasize that UNIX will continue to play a significant role in the company’s product line for years to come. Microsoft’s Windows NT is becoming increasingly important within the technical community, but large organizations seem reluctant to make a wholesale switch at this time.

There are two reasons for this: many engineering and information technology managers believe that UNIX is more reliable and more scalable than NT, and, since the price of UNIX workstations has come down dramatically recently, few managers want to incur the pain of an operating system transition to save perhaps less than $10,000 per seat.

HP is taking a number of steps to maintain its existing UNIX business while at the same time taking advantage of the growing interest in Windows NT.

1) It has merged the previously separate technical and commercial UNIX server operations. The differences between the two classes of servers are fairly inconsequential today, and many organizations use the same systems to support both the high-end computational needs of engineers and the data processing needs of MIS departments.

2) More significantly, HP is moving its UNIX workstation organization into the company’s Personal Information Products Group. This will facilitate providing users with a mix of UNIX and NT desktop systems. In the past, the Workstation operation has had a very pro-active industry marketing operation while the PC operation had little activity in this area. We believe that combining the two groups in one umbrella organization will result in a more coordinated product strategy for HP’s technical customers.

3) To support the blending together of the UNIX and NT worlds, HP has introduced new software products that facilitate operating mixed environments including the Enterprise File System which we discussed last month.

4) As we also described last month, HP is forging a closer working relationship with Microsoft. This relationship will be particularly important later this decade when Intel and HP release their mutually developed Merced (or I-64 as it is known in some quarters) microprocessor. That chip will require a new 64-bit version of Windows NT if it is to be successful.

Multiframe — CAD Tools For Structural Engineers (07jun97)

June 1, 1997

We recently came across a company that we bet few of you have heard of, but we think you will be impressed by their products. These design and analysis packages for structural engineers, collectively known as Multiframe, are developed and marketed by Formation Design Systems (formerly Graphic Magic). Interestingly, the Scotts Valley, California-based company began in the mid-1980s in Australia (where the software is still developed today) as a surface modeling and structural tool vendor for boats and ships.

Today, Multiframe is an extensive range of products for a variety of tasks performed by structural engineers. The company says that its users worldwide use Multiframe for many different types of design, including steel, concrete, and aluminum construction.

The Multiframe Range of Products

The Multiframe line consists of:

Multiframe 2D - This is an analysis tool for small scale problems that can be adequately solved in two dimensions. It has all of the modeling commands found in the 3D modules, and lets you quickly model, load, and analyze 2D structures. As examples, a simple continuous beam, portal frame, or more complex truss can be readily sized, managed, and analyzed.

Regularly spaced structures are easily handled, as Multiframe’s 2D snap-to-grid, automatic generation, duplication, rotation, and extrusion functions are well-suited to repetitive model construction. A built-in library of common structural shapes lets you analyze steel, concrete, or timber frames. You can also include sections made from any other material or automatically include sections designed using another of the company’s applications, Section Maker.

Price is $495.

Multiframe 3D - This 3D analysis tool contains a built-in range of methods for automatically constructing geometry, restraints, materials, and loading conditions. All types of framed structures can be modeled and can include special structural features, such as springs, prescribed displacement, member releases, and pinned joints.

Multiframe 3D provides automatic factoring of load cases, automatic inclusion of self weight, and a range of commands to automate the generation of regular geometry, such as continuous beams, curved beams, trusses, and high-rise frames. Like the 2D product, Multiframe 3D contains a built-in library of common structural shapes that lets you analyze frames composed of several materials. You can also include sections made from any other material or automatically include sections designed using Section Maker.

Price is $1495.

Multiframe 4D - This dynamic analysis tool helps you determine the dynamic response of frames being designed. It features a fast solver that uses sub-space iteration to quickly find the natural modes, frequencies, and periods of structures being designed. For more complex problems, Multiframe 4D’s time history response option lets you simulate a structure’s response to seismic or other loading over time. A built-in library of earthquake spectra can be used or you can enter and store your own data sets for use with Multiframe 4D.

Either lumped or distributed masses can be used to automatically take into account the self weight of a structure. Because you can control the number of mode shapes or time steps and convergence criteria, the module can calculate a quick estimate of dynamic response or a more rigorous, accurate solution. When analysis is complete, a mouse click on a joint or member displays a graph of its response and behavior over time. Alternatively, the entire structure can be animated to show the total structural response.
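To illustrate the kind of modal analysis such a solver performs (Multiframe 4D’s internals are not public, and the two-story shear frame and all its numbers below are entirely hypothetical), the natural frequencies come from the generalized eigenproblem K·phi = w²·M·phi:

```python
import math

# Minimal sketch of modal analysis for a hypothetical 2-DOF shear frame
# with lumped masses: solve K·phi = w^2·M·phi for natural frequencies.
# All stiffness and mass values are illustrative, not from any product.

def natural_frequencies_2dof(k1, k2, m1, m2):
    """Natural circular frequencies (rad/s) of a 2-story shear frame."""
    # Form A = M^-1·K for the shear-frame stiffness and diagonal mass
    # matrices; the eigenvalues of A are the squared frequencies.
    a11 = (k1 + k2) / m1
    a12 = -k2 / m1
    a21 = -k2 / m2
    a22 = k2 / m2
    # Eigenvalues of the 2x2 matrix via the characteristic quadratic.
    tr, det = a11 + a22, a11 * a22 - a12 * a21
    disc = math.sqrt(tr * tr - 4.0 * det)
    lam = sorted(((tr - disc) / 2.0, (tr + disc) / 2.0))
    return [math.sqrt(l) for l in lam]  # lowest mode first

w = natural_frequencies_2dof(k1=2.0e6, k2=1.0e6, m1=2000.0, m2=1000.0)
print([f"{wi:.1f} rad/s" for wi in w])
```

A production solver handles thousands of degrees of freedom with iterative methods such as the sub-space iteration mentioned above, but the underlying eigenproblem is the same.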

Price is $1995.

Section Maker - A custom structural shape is sometimes the best solution to a structural design problem, but designing and calculating the physical properties of these custom shapes can introduce unintended problems. Section Maker eliminates the need for any calculation of section properties by automatically computing the properties of sections made from virtually any material. Section Maker lets you quickly design and construct a shape and then integrate this shape into the Multiframe library of sections. Built-up sections, including standard shapes, such as I-beams, channels, hollow sections, and tubes, can be automatically created by entering a few key dimensions. Any standard shape from the sections library can be placed and combined with other sections. Once you have the shape you need, you can install it in any group in the library and use it in any Multiframe structure.

Section Maker also lets you automatically create a range of sections that interpolate between two other shapes. Tapering members can be simulated or you can create a whole table of shapes by designing the parent form.
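The bookkeeping that Section Maker automates can be sketched for a built-up shape made of stacked rectangles; the I-beam dimensions below are illustrative, not a catalog shape:

```python
# Sketch of automatic section-property calculation: area, centroid, and
# second moment of area about the horizontal centroidal axis for a
# built-up shape of stacked rectangles. Dimensions are hypothetical.

def section_properties(rects):
    """rects: list of (width, height, y_bottom) rectangles on one axis."""
    area = sum(w * h for w, h, _ in rects)
    # Centroid height from the first moment of area.
    ybar = sum(w * h * (y + h / 2.0) for w, h, y in rects) / area
    # Parallel-axis theorem: I = sum(b*h^3/12 + A*d^2).
    ixx = sum(w * h ** 3 / 12.0 + w * h * (y + h / 2.0 - ybar) ** 2
              for w, h, y in rects)
    return area, ybar, ixx

# Symmetric I-beam built from three rectangles (units: mm).
ibeam = [(100.0, 10.0, 0.0),    # bottom flange
         (6.0, 180.0, 10.0),    # web
         (100.0, 10.0, 190.0)]  # top flange
area, ybar, ixx = section_properties(ibeam)
print(f"A = {area:.0f} mm^2, ybar = {ybar:.1f} mm, Ixx = {ixx:.3e} mm^4")
```

For the symmetric shape above, the centroid lands at mid-height (100 mm), and the flanges dominate the bending stiffness via their parallel-axis terms.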

Price is $395.

Steel Designer - The key to efficient steel structural design is always the optimization of a structure in accordance with applicable codes of practice. If you are using Multiframe for analyzing steel structures, Steel Designer automates the checking of structures against steel codes of practice. Steel Designer is integrated with Multiframe so that it can directly access analysis results and perform design checks. However, Steel Designer is not just a "black box"; it gives you control over all aspects of the structural design process — for example, which clauses you want to check or what effective lengths you want to use. Steel Designer also provides you with a set of design aids for searching through a structure for specific design criteria and then graphically highlighting these areas of interest.

Detailed tables of stresses, deflections, and forces for a whole structure or a single member let you determine critical loading conditions. Steel Designer provides a complete range of output that can combine text, tables, and diagrams. All data can be copied to other office or drawing applications for further documentation.

Price is $695.

Data Exchange

Multiframe products are not really intended as standalone applications, but rather for use in conjunction with other products for performing detailed design, drafting, solid modeling, and rendering. The key here, of course, is effective data exchange. The Multiframe family supports several standard data exchange formats, including IGES (reading and writing data to applications such as MicroStation) and DXF (reading and writing to applications such as AutoCAD), and it exports 3DMF and QuickDraw 3D files to various packages. Tabular data can also be copied and pasted from Multiframe products into most popular spreadsheets, such as Excel.

Bundles and Free Technical Support

Multiframe products are currently available for Macintosh Power PC and Windows NT and 95 platforms.

The company offers three discounted product bundles that include:

  • Starter Bundle - Multiframe 2D, Section Maker, and Steel Designer for $995.
  • 3D Bundle - Multiframe 3D, Section Maker, and Steel Designer for $1995.
  • Professional Bundle - Multiframe 4D, Section Maker, and Steel Designer for $2495.

All products and bundle purchases come with free upgrades and free technical support for the first 12 months following the initial purchase.

Quite frankly, before we wrote this article, we didn’t know much about Formation Design Systems and Multiframe. Now that we do know a little something, we are quite impressed by the big ideas emanating from this little company, and we think you might be, too.

Contact: Formation Design Systems Inc., 408-440-0702,

EA Systems—Managing Plant Lifecycle Data From Concept Through Operation (01jun97)

June 1, 1997

As 2D and 3D CAD packages revolutionized the speed of general technical drawing production in the mid-1980s, owner/operators and engineers of process and power generation plants thought they’d found the end-all to production snags. They were impressed with the ability to quickly generate plant drawings and keep them filed electronically.

Now, in the late 1990s, the need to produce drawings quickly has taken a back seat to something much bigger — plant lifecycle data management (PLDM) capabilities — the ability to maintain an intelligent computer-generated data model that supports all the information management issues in the 30- to 40-year operating life of process and power plant facilities. Data-centric enterprise automation systems produce drawings on demand and also provide many high-end functions for design and analysis.

Driving this movement and transition to PLDM are the plant owner/operators, who have come to expect more than just drawings as the final deliverables on their engineering projects.

It’s a change in product philosophy and scope being driven by customers rather than by the vendor community, but one that companies like EA Systems Inc. have been striving for. Andrew K. Selden II, chairman and CEO of EA Systems, is a strong proponent of this concept and his thinking has influenced a suite of products able to "capture, employ, and maintain essential plant data over the plant lifecycle." This data-centric philosophy employs software to build an integrated 2D logical and a 3D physical model of a plant in a database. According to Selden, "What you see on the screen is a graphical representation of what’s in the database; it’s not just a drawing. Our software maintains data about a plant … we have a 3D physical model and a 2D intelligent schematic model."

The 3D model is a graphic spatial or geometric representation of the plant with dimensions for each room, each object, each pipe, etc. The 2D model is the logical model, showing the "flow" of a line attached to a certain valve, pump, heat exchanger, etc. "These functions need to be integrated," he explained, "in order to gain benefits like data consistency and change management."

EA Systems’ PLDM permits a company to capture data early, and build a coherent data model in a database only one time. The company can then modify, change, expand, and build cumulatively on that initial electronic model of the plant as changes take place.

Data-Centric Beginnings

Selden’s quest for an integrated data-centric product suite started in 1994, when he acquired and made a commitment to EA Systems. Since that time the company has grown from 25 people to more than 70 today, with offices in the US, England, and Asia. Selden attributes the company’s current, rapid growth to changing client expectations and EA Systems’ list of powerful "case study" installations where the PLDM approach and products have proven themselves.

The Rohm and Haas Co. (Bristol, PA), for example, uses software from EA Systems in initial phases of a "4D-PLDM" effort, which adds time to the equation so that the longer PLDM tools are used, and the earlier in the cycle, the more users can expect to save. With help from EA Systems, Rohm and Haas is implementing a plan to reduce its capital expenditures budget by 50 percent while, at the same time, reducing its time-to-market for new plants by 50 percent. These benefits are expected to generate savings of several hundred million US dollars over the next three years.

Changing Hands In The Past

The centerpiece of EA Systems is PASCE (pronounced "pace"), an acronym for Plant Applications and Systems for Concurrent Engineering, which is used extensively for building 3D models and databases of process and power plants. It was initially developed by a partnership of Duke Power Co. (Charlotte, NC), ICI Plc (Runcorn, UK), and Combustion Engineering (Stamford, CT) as a "very practical, results-oriented package."

While PASCE was generally acknowledged as a strong and proven lifecycle software product, its development was slowed through a series of corporate ownership changes. Asea Brown Boveri (ABB) (Lucerne, Switzerland) acquired the product suite when it purchased Combustion Engineering, but then sold a majority interest to Digital Equipment Corp. (Maynard, MA). In November 1994, essentially all of the PASCE-related business was purchased by Selden and a core group of senior managers and employees.

Corporate-level changes had little impact on users of PASCE software, many of whom worked closely with the product’s development team members to enhance functionality and deploy the software throughout the various lifecycle phases of plants — preliminary design, detailed engineering, construction, operation, and maintenance.

Rather than just pay lip service to the importance of intelligent schematics to the plant design process, in December 1994 EA Systems went to Cambridge, England, and acquired Advanced Systems Consultants Ltd. (ASC), the company founded in 1982 and run by acknowledged plant expert and managing director James M.D. Merry. ASC developed Phoenix, a comprehensive, database-driven schematic design system with widespread acceptance in Europe. Phoenix is the basis for EA Systems’ PlantSCHEMA product.

"There was a natural intellectual fit between the products," Merry pointed out. Duke Power and Electricite de France (Paris, France) were among the earliest customers to adopt the 2D tools to design nuclear power plants and manage a full model of those plants through use of the logical model. "Customers had recognized that having both the 2D and 3D products was quite valuable — that’s what sets PASCE apart from other systems," Merry said.

1995 was a transition year for the company in developing new products and working with beta sites like Duke Power to fully integrate and test various modules. In early 1996, PASCE was introduced for Windows NT in addition to Windows 95, UNIX and OpenVMS. It fully incorporated PlantSCHEMA for 2D intelligent schematics and PlantVIEW, a 3D plant modeler for better plant-wide data consistency and single-time data entry. Several complementary modules also were introduced.

The data-centric products caught on quickly with buyers around the world, many of which have used EA Systems’ PASCE software since its inception. LG Engineering Co. Ltd. (LGEN), an international engineering and construction firm with headquarters in Seoul, South Korea, uses PASCE as its core application for a worldwide concurrent plant engineering system. The system lets LGEN’s engineers, drafters and designers work on projects simultaneously from remote locations. "Our experience using PASCE on a variety of projects over the past four years has confirmed that the software provides significant efficiencies for the multidisciplinary plant design environment," said Jung-Hee Won, senior managing director for the engineering division at LGEN. LGEN has used PASCE applications since 1991.

Some of EA Systems’ other major customers include Dow Chemical Co. (Midland, MI), 3M Co. (Minneapolis, MN), and Samsung Engineering Co. Ltd. (Seoul, South Korea).

The Turnaround

The company that Selden and the management team acquired in late 1994 was eight years old and, thanks to a sound beginning, good funding, and a stable management team, had a good product, but a very limited presence in the market. Revenues in 1994 were just $4.1 million. Despite 1995 being a year of transition, spent incorporating the ASC team from Cambridge and developing a version of the product for the rapidly growing Windows NT market, revenues increased 25 percent to $5.1 million.

The first year that could take advantage of the investments made in product and people, 1996, showed a revenue growth of almost 50 percent. For 1997, the management team predicts another year of growth with revenues of $11.5 million.

Selden came on the scene at the right time, but this turnaround is not his alone. The employees and customers of EA Systems readily acknowledge the good fortune of Andy Selden happening along when he did to effect the management buyout (MBO). However, the company is in a good position today because of the strength and commitment of the key players and the fact they obviously work well as a team.

Selden has a BS in engineering from Georgia Tech, an MBA from Harvard, and is a former executive vice president of E.F. Hutton and managing director of Bankers Trust and Chemical Bank. He brings the benefit of more than 25 years’ experience in investment banking, merchant banking and venture capital to EA Systems; technology-based and service-oriented companies are not new to Selden.

EA Systems includes many charter members from more than ten years ago and is led by its president, Shiraz M. Jaffer. Jaffer holds a BS degree in civil engineering from the University of East Africa and has more than 20 years’ experience in nuclear plant design engineering and plant information management. Jaffer’s background doubtless has contributed to the company’s success in the utility industry.

Shelf Life

PLDM may be new to some readers, but it’s a concept that’s been around for some time and one that will continue to dominate plant automation into the future. Few companies actually use software for PLDM, because most commercial CAD packages haven’t taken a data-centric approach and too many users are not sure where to turn. To succeed, EA Systems realizes it must continue educating customers on its data-centric approach and assure them that PASCE is an "open" system: data can move in and out of other packages, and PASCE can be linked to other software and used in lifecycle phases both earlier and later than plant design. By applying a data-centric model earlier in the process, EA Systems believes, companies maximize benefits and achieve bigger savings.

For example, at Duke Power, PASCE is used during the detailed design and the post-construction phases for lifecycle management. "At two nuclear plants, Duke used other tools to capture data and moved that data into PASCE where it resides. The data can be accessed by other packages like their maintenance management software program, work scheduling programs, and safety and nuclear radiation monitoring programs. PASCE is used as a graphical navigation window into their plant. When they want to see what’s going on in a particular area, or they want to model their work management, they pull up the model and object with PASCE and they know what modifications are being performed," Shiraz Jaffer explained. "They are also linking PASCE with their real-time systems to monitor what’s going on in the plant linked to their electrical systems, instrumentation systems, radiation monitoring systems, etc."

This enterprise-wide outlook means a company like Duke Power will use the software for 30, 40, or even 50 years — literally throughout the facility’s life.

The PASCE Family Of Products

All PASCE applications run on Windows NT and UNIX platforms. Windows NT is a platform that EA Systems views as especially strategic, because the company realizes that Windows NT interoperability provides an opportunity for further integrating physical plant data assets with information management packages from other vendors.

The key products of the PASCE applications suite include:

PlantSCHEMA 10 - a comprehensive 2D schematic modeler that consolidates both 2D graphic and non-graphic information into one central database. All plant design and operating information, including P&IDs, resides in this single database, providing a tool that can be used throughout a plant’s lifecycle.

Standard features of PlantSCHEMA include:

  • Object connectivity
  • Component catalog
  • Scalability
  • 2D-to-3D interface for transferring components to PlantVIEW
  • Expanded symbol libraries
  • Multiple database access
  • Workflow management
  • Enhanced image handling
  • Object linking and embedding (OLE) support
  • Report writer that can generate reports in hypertext markup language (HTML) for Web publishing
  • User-defined interface tools and interfaces to other engineering software applications

PlantVIEW - a 3D plant modeler that lets engineers create and extensively evaluate the operation of a complete, computer-based model of a new or existing plant before any financial commitments are made. PlantVIEW supports piping; structural; electrical; instrumentation; HVAC; and civil engineering disciplines by providing 3D geometric information; automatic generation of piping isometrics and associated bills of material; as well as integration with third-party tools.

Features for the PlantVIEW physical (3D) data manager system include:

  • Stored dimensional data on more than 16,000 standard plant components
  • 3D modeling creation and editing functions
  • Parametric equipment modeling and volumetric calculations
  • Pipe routing
  • Interference detection, display, and resolution
  • On-demand hidden line and shaded view generation of 3D volumes
  • User-selectable choice of drawing automation and drafting facilities
  • Interfaces to other engineering software applications
  • OLE automation and ODBC drivers for accessing standard desktop applications
  • Ability to import/export 3D DXF and DGN files
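Among the PlantVIEW features above, interference detection is the most algorithmic. As a rough illustration of the idea (not EA Systems’ implementation; all names here are hypothetical), the usual first pass is an axis-aligned bounding-box overlap test between components:

```python
# Hypothetical sketch of a first-pass interference (clash) check between
# 3D plant components, using axis-aligned bounding boxes (AABBs).
# A refined geometric test would follow for any pair flagged here.

def boxes_interfere(a, b):
    """a and b are ((xmin, ymin, zmin), (xmax, ymax, zmax)) tuples.
    Two boxes interfere only if their extents overlap on all three axes."""
    (a_lo, a_hi), (b_lo, b_hi) = a, b
    return all(a_lo[i] <= b_hi[i] and b_lo[i] <= a_hi[i] for i in range(3))

pipe = ((0.0, 0.0, 0.0), (5.0, 0.2, 0.2))    # a pipe run
duct = ((4.0, -1.0, -1.0), (6.0, 1.0, 1.0))  # an HVAC duct crossing it
beam = ((10.0, 0.0, 0.0), (12.0, 3.0, 0.5))  # a distant steel beam

print(boxes_interfere(pipe, duct))  # True: clash flagged for resolution
print(boxes_interfere(pipe, beam))  # False
```

In a full plant model, a spatial index would cull most pairs before this test ever runs, which is what makes clash detection tractable on models with thousands of components.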

Complementary PASCE modules include:

PlantCAD - links the PASCE plant model database to AutoCAD, providing drawing updates as design changes take place.

PlantLINK - combines PASCE’s database-driven 3D modeling system with Bentley Systems’ MicroStation applications.

For native AutoCAD or MicroStation users, all annotations and dimensions created are linked directly to the 3D PASCE model in PlantVIEW. This linkage maintains associativity between PASCE’s database and the drawings created. All modifications made to the PASCE model are automatically recognized by PASCE’s configuration manager and, if properly approved, implemented in the appropriate drawings.

PlantWALK - the PASCE animation application for plant design review uses models created in PlantVIEW to produce realistic animated "walk throughs" of the 3D models. Users can use the virtual models to review, comment, verify, and simulate "what if" scenarios, such as construction sequencing, long before the design is approved for actual construction.

PlantWEB/PlantBROWSER - integrates with the Microsoft Internet Server and provides multi-user, real-time access to PASCE databases for Internet/intranet-enabled plant engineering.

PlantSTEEL - a structural steel analysis interface.

PlantPIPE - a piping analysis interface.

Data linkage between PlantSCHEMA, PlantVIEW, and the complementary PASCE modules means that the data contained within them is entered only once. Centralizing this data helps ensure the accessibility and consistency of current information for whomever needs it, regardless of the size or complexity of the plant model.

The PASCE product and data architecture also supports live, online links to instrumentation and control monitors in a plant, letting owner/operators use the database for operations, maintenance, and regulatory compliance throughout the life of a facility.

Prices for the basic PlantSCHEMA and PlantVIEW modules range from $15,000-$20,000. Contact EA Systems for specific prices on the PASCE complementary modules.

New Directions, New Products

Use throughout the total lifecycle is exactly the marketing message EA Systems is trying to get across to expand its customer base. Earlier this year, the company demonstrated work from a major project undertaken in cooperation with Mitsubishi Chemical Corp. (MCC). The project integrates the process evaluation and costing software from ICARUS Corp. (Rockville, MD) with PASCE for use by MCC, providing an integrated conceptual design environment with automation functionality for basic engineering and subsequent detailed design and operations.

It’s also the springboard for other products to be announced by EA Systems, moving PASCE further into the "conceptual design" phase of plant design with a new product called PlantCONCEPT. From the beginning, the PASCE suite of products was developed to store all plant information in a single, comprehensive database. PlantCONCEPT extends this single database to include planning process data on the front-end of a project, thus virtually eliminating redundant data entry.

Mitsubishi evaluated many engineering and data management packages before choosing PASCE for its worldwide plant design work. The company will use PlantCONCEPT as the basis for a concurrent process engineering (CPE) environment for the conceptual phase of projects. EA Systems claims that PlantCONCEPT has more features than any other process and instrumentation diagram (P&ID) package. To back up this claim, EA Systems says that PlantCONCEPT is the only software tool that provides links to process simulation, cost estimating, and heat exchanger sizing software. PlantCONCEPT also automatically converts block diagrams to process flow diagrams (PFDs), then converts PFDs to P&IDs.

By automating much of the drawing process for conceptual designs, PlantCONCEPT also simplifies initial design review and approval. After conceptual design work is completed, the PlantCONCEPT module integrates the data in the early process planning with the PASCE plant information database for use in subsequent detailed design, operations, and lifecycle maintenance.

Plant data, and the reports generated by PlantCONCEPT, are incorporated into PASCE’s object-oriented, data-centric, 2D plant information database — PlantSCHEMA, an intelligent schematics and plant logic modeler.

Several PASCE customers are using current Internet/intranet technology to provide members of their staff with engineering drawings and current data from PASCE models. To address this need, EA Systems introduced PlantWEB and PlantBROWSER in late 1996. According to the company, this back-end server and client allow easy access to PlantVIEW (3D) and PlantSCHEMA (2D) data over the Internet or corporate intranet.

Innovation Drivers

Economic drivers are causing companies like American Electric Power (AEP) (Columbus, OH), Rohm and Haas, and LGEN to insist on delivery of a coherent data model for PLDM. Regulatory organizations, such as OSHA and the NRC, are establishing new regulations for safety and environmental compliance, which drives companies to demand more integrated toolsets. "They need software that can retrieve the data and give them logical connectivity to achieve compliance objectives," said Arvind Patel, vice president of sales and client management at EA Systems. "Until two years ago this market was focused heavily on plant design. Now, companies are asking for an electronic data model to use for plant operations and maintenance. They are asking how our model works, what are its benefits and how can we deploy this not just for engineering design but enterprisewide or even worldwide. It’s really a significant shift," he noted.

And it’s saving time and money with early adopters, which is leading other companies to look around. "Any company using process simulation packages could use PASCE — that’s a pretty significant market," Patel advised.

The company plans to expand its base of clients by rolling out a series of discipline-specific applications, such as intelligent P&IDs, elementary electrical drawings, and wiring and instrument loop diagrams. "We’re trying to offer something that is not traditional and that offers benefits in areas that aren’t tapped. We think the market is sizable for that, it brings us into domains where CAD historically has not been used," he said.

EA Systems is active in open systems groups like PlantSTEP Inc. and has its eyes open for new partnerships and joint product-development opportunities. "We don’t believe that we deliver a total solution in any one phase of the lifecycle," said Selden. "But we feel we’re part of the total solution. We are open to exploring alliances that address those other lifecycle phases or where we have complementary products with those of other companies."

"We are true believers in PLDM and fortunate that companies like Duke Power and Rohm and Haas have done it. Now we need to communicate the vision and make sure people understand that it’s not just a theory at this point. The results are real bottom-line savings," Selden said. The mixed blessing is that competitive software companies are also hearing the message, and migrating products to adopt a more data-centric look and feel. "However, we’ve had a two-year lead in terms of products and technology and we think we’ll maintain that lead."

For 20 years, A-E-C Automation Newsletter has followed the plant design industry, espousing a data-centric approach, and we’ve seen too many vendor companies fail in this heady endeavor. We are confident, however, that EA Systems is a survivor and will continue its successful ways, and the market it has targeted is sure to follow with similar successes.

Note: EA Systems has produced a 16-page white paper that does a nice job presenting the company’s products and services, and where it thinks this industry must go if it is to continue to meet the needs of the owner/operators of process and power plant facilities. It is entitled, "The Four Dimensions of Plant Lifecycle Data Management." For a copy of the white paper, contact Marie P. Telepneff, manager of marketing and communications, at 510/748-4855 or

EA Systems Inc.
980 Atlantic Ave., Suite 103
Alameda, CA 94501

Tel: 510-748-4700
Fax: 510-748-4717

Company Specialty:
Software and services for designing and engineering complex industrial process and power plants, and managing information over the operating life of plant facilities.

Founded: 1986

Ownership: Privately held

ArchT — For Architects, By Architects (07jun97)

June 1, 1997

It isn’t often that software users (much less architectural software users) declare that the developers of products they use understand their needs, but we have run across one of these rare instances with a product called ArchT v13.5 by Ketiv Technologies Inc. We feel that a major reason behind ArchT’s success with architects is the fact that it is developed and enhanced, at least in part, by licensed architects. From the beginning, Ketiv has focused on creating architectural tools that can be used right out of the box, or customized to meet the needs of virtually any architectural office. ArchT is a shining example of this goal.

ArchT works within AutoCAD and provides enhanced architectural functionality to it. ArchT has been around the block, too, since 1985, so you can rest assured that it works well with AutoCAD for architectural work. It has been considerably overhauled and enhanced to the point where it is suited for just about anybody who needs 2D architectural documents with the flexibility to get 3D models and rendered images, pre-defined and customizable reporting and scheduling capabilities, plus architectural content.

As testimony to ArchT’s usefulness in the architectural arena, Ketiv recently signed a volume purchase agreement with Hellmuth, Obata, and Kassabaum (HOK) Inc., the largest architectural firm in the US, for an undisclosed number of AutoCAD and ArchT packages.

The First ARX Architecture Application

ArchT was the first shipping AutoCAD Runtime eXtension (ARX) application for architecture, initially running inside AutoCAD R13. Ketiv has been an Autodesk Authorized Developer and has over 14 years of experience in AutoLISP, ADS, and ARX programming, along with architectural domain expertise.

Ketiv claims that ArchT was the first object-oriented software product for architecture. Instead of manipulating simple geometry (lines, arcs, and circles), ArchT users actually design with objects (windows, doors, and walls). These objects are "intelligent" about what they are and how they interact with other objects in the drawing. For example, if the style of a window is changed, ArchT first redraws the window to meet the new specifications, and then redraws affected walls to accommodate the change.
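The window-and-wall behavior described above can be sketched in a few lines. This is purely a hypothetical illustration of the "intelligent object" idea, not Ketiv’s implementation; all class and method names are invented:

```python
# Hypothetical sketch: a style change on a window object triggers a
# redraw of the window itself, then of the wall that must accommodate it.

class Wall:
    def __init__(self):
        self.redraw_count = 0

    def redraw(self):
        self.redraw_count += 1

class Window:
    def __init__(self, wall, style):
        self.wall, self.style = wall, style
        self.redraw_count = 0

    def redraw(self):
        self.redraw_count += 1

    def set_style(self, style):
        self.style = style
        self.redraw()       # redraw the window to the new specification...
        self.wall.redraw()  # ...then redraw the affected wall

wall = Wall()
win = Window(wall, "double-hung")
win.set_style("casement")
print(win.style, wall.redraw_count)  # casement 1
```

The point is that the user edits the design intent (a style), and the dependent geometry updates itself, rather than the user re-drafting lines and arcs by hand.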

Ketiv is an interesting company, in that it not only develops software products, but is also a major AutoCAD reseller, training, and support center, specializing in architecture. The company also offers a taste of mechanical design, as well. In addition, the American Institute of Architects (AIA) recently designated Ketiv as a Premier Training Center for fulfilling annual continuing education requirements.

Draw It Once And Refine It

Ketiv’s primary philosophy and motto is "Draw it once." We think a better tagline would be to "Draw it once, simply, and refine it as you go." Ketiv realized early that everybody hates repeatedly drawing the same architectural elements, so ArchT was designed to handle the geometry while users can experiment with the available options — creating and revising until a design solution is reached. Many architectural elements, such as doors, roofs, stairs, windows, etc., can be quickly drawn using ArchT’s library of standard styles. Styles can be created, edited, and previewed. ArchT also has several utilities for speeding production drafting.

For working quickly, you can draw in 2D and then display the design in 3D with a simple mouse click. ArchT’s unique methodology stores 3D data as 2D drawings are being created within style definitions and blocks.

Overall, ArchT is very easy to use with an interface that is intuitive and, as importantly, consistent.

General Requirements

For running ArchT, Ketiv recommends 32 MB of RAM, but we have heard few complaints from users who have only 16 MB installed. Of course, AutoCAD is also required; R13c4 or higher. ArchT is priced at $1,195 and comes with 90 days of free toll-free technical support directly from Ketiv.

Contact: Ketiv Technologies, 800-458-0690, 503-252-3230,

Visio Enters GIS Market With Visio Maps Add-On (01jun97)

June 1, 1997

It seems that everybody and his brother is realizing that there is big money to be made in the burgeoning GIS marketplace and a constant stream of products keeps emerging. Some of these GIS products are "light" versions of bigger, monolithic products, while others have been a little too "light" and hardly worth the time, effort, or money to try and learn.

The reason that so many players have tried to enter the GIS playing field, of course, is money. For example, market research firm Daratech projects that in 1999, the worldwide GIS software market for all applications running on all platforms will exceed $1.5 billion. Dataquest, another market research firm, expects sales of GIS business applications alone to exceed $190 million by 1999.

While it is not difficult to understand why Visio is attempting to break into GIS, it is hard to categorize exactly what Visio has done with its new product, Visio Maps. At first glance (we looked at a pre-beta version), however, the company seems to have done many things right for getting real GIS technology into the hands of a potentially much bigger business customer base.

Partnering with ESRI

In no small undertaking, Visio has partnered with Environmental Systems Research Institute (ESRI) as an integral part of the overall Visio Maps strategy.

With help from ESRI, Visio Maps was designed to work in conjunction with and be compatible with ArcView. Organizations that use ArcView will be able to extend the usefulness of their maps by saving their project information in a format specifically designed to be read by Visio Maps.

We have wondered for some time if ESRI would ever be amenable to a GIS product suited for a much larger potential market; and they have now shown that they are. We feel that these technology and business moves are good ones that will benefit both Visio and ESRI.

How It Works

Visio Maps is an add-on product that runs on top of any of the company’s other core products — Visio Standard, Visio Professional, or Visio Technical.

Of course, users can’t expect Visio Maps to be as comprehensive as ESRI GIS products. Visio Maps extends the Visio product line so users can visualize their business information, showing geographic relationships and trends. It employs the same interface and drag-and-drop metaphor for creating and sharing maps and other geographic data.

Major Visio Maps Features

Because so much research was conducted before Visio Maps was actually created, it contains features that other vendors are likely to take notice of and "mimic." Some of the more prominent features include:

New SmartShapes Symbols - Visio Maps comes with more than 120 mapping-specific symbols of Visio’s trademark SmartShapes for creating and annotating maps.

Geographic Mapping Features - These include thematic rendering for more effectively presenting thematic maps; address matching to street level for pinpointing location information; find by location for zooming in and highlighting a specified location or feature; and spatial selection tools for querying data associated with a selected boundary.

Data Sets - A wide range of geographic-boundary and demographic data, such as streets, rivers, cities, key landmarks, and population, for use in creating thematic maps.

Wizards - Visio Maps Wizards help users step through the creation and use of data-driven maps in such a way that they don’t have to be database or GIS experts.

ESRI/ArcView Data Compatibility - ArcView users can extend the utility of their maps by saving their project information in a format specifically designed to be read by Visio Maps. Visio Maps also will read any standard ESRI Shapefile format data and import ArcView map projection information.

Lastly, because users will be able to see geographic trends in their business data, they can join data from Microsoft Access, Excel, FoxPro, or dBase to the geographic data included in Visio Maps.
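Conceptually, that join is a simple key match between business records and geographic boundaries. A minimal sketch, assuming a shared region code as the key (the data and field names here are invented for illustration, not Visio’s actual mechanism):

```python
# Hypothetical sketch: joining business figures exported from a desktop
# database (e.g. Access or Excel) to geographic boundary data by a
# shared region code, the basic step behind a thematic map.

sales_by_region = {          # business data, keyed by region code
    "CA": 1250000,
    "WA": 840000,
}

region_shapes = {            # boundary data shipped with the map product
    "CA": "polygon:california",
    "WA": "polygon:washington",
    "OR": "polygon:oregon",
}

# Join: attach a sales figure to every region that has one;
# regions with no business data get None and can be left unshaded.
themed = {
    code: {"shape": shape, "sales": sales_by_region.get(code)}
    for code, shape in region_shapes.items()
}

print(themed["CA"]["sales"])  # 1250000
```

A thematic renderer would then color each boundary by its joined value, which is exactly the kind of task the Visio Maps wizards aim to hide from non-GIS users.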

Good Odds For Hitting Target Customers

Before creating Visio Maps, the company did extensive market research and discovered that users had specific needs for a desktop mapping application, regardless of industry or vertical market. These criteria were then transformed into the design goals for Visio Maps. With this in mind, Visio realized that its Maps product would have to provide the following:

  • A fast and easy way to create and publish maps
  • Flexible annotation capabilities to enhance maps for presentations
  • Easy-to-use and consistent interface
  • Compatibility with higher-end GIS software
  • Customizable for specific needs and market segments
  • Affordable and easy to learn

Visio realizes, and rightly so, that potential users of Visio Maps are diverse and span many industries, so the customer profile takes aim at basically two groups:

  • Current users of Visio who are already familiar with the interface, features, and capabilities. Since the company estimates that 70 percent of all business information generated today contains some type of geographic data, the more than one million current users of Visio will potentially discover new ways for creating, incorporating, and sharing maps.
  • Adjacent ESRI ArcView users who work with, but generally do not create, ArcView maps because of the high acquisition and training costs associated with ArcView. These more casual users are not GIS professionals, but they need to view geographic data to perform tasks such as optimizing sales territories, deciding where to stage promotions, targeting customers, and locating competitors.

Visio’s partnership with ESRI helps ensure that Visio Maps users have access to the same data as GIS specialists using the ESRI products.


Visio Maps requires Visio 4.0c or later, Windows 95 or NT 4.0, and a CD-ROM drive. It comes on three CD-ROMs and includes the product, demographic data from ESRI, and other assorted goodies. The US street price for the Visio Maps add-on is $199. Bundled pricing is available when Visio Maps is purchased with a Visio application. As we went to press, the only bundle price we could get was Visio Maps bundled with Visio Standard, which is priced at $299. Visio Maps will begin shipping in July and will be generally available in August from corporate resellers, distributors, retail channels, and directly from Visio and ESRI.

We will keep a close eye on this product and will review a shipping version of Visio Maps and publish our findings either later this summer or early fall.

Contact: Visio Corp., 800-24-VISIO,

Silicon Graphics And Coryphaeus Software Team For Simulating Proposed Bay Bridge Designs (01jun97)

June 1, 1997

Proposed designs for a new San Francisco Bay Bridge were recently brought to life with the help of real-time 3D graphics technologies developed by Silicon Graphics Inc. (Mountain View, CA) and Coryphaeus Software Inc. (Los Gatos, CA). Three new designs for the Bay Bridge, damaged in the 1989 Loma Prieta earthquake, were digitally simulated using a new 3D urban simulation application called MetroSim from Coryphaeus running on Silicon Graphics Onyx2 InfiniteReality systems.

The bridge simulations are letting municipal planners, civil engineers, and government officials actually "fly through" the three designs being considered for the proposed $1.3 billion reconstruction of the landmark bridge’s Eastern Span — which extends from Oakland to Yerba Buena Island. The immersive and realistic simulations are visually more powerful than pre-created animations, because viewers can go wherever they want and see the bridge and its surroundings from any angle. They can experience what it is like to drive on the bridge, to pilot a ship beneath the structure, or fly above it in an airplane.

After an independent study commissioned by the California Department of Transportation (Caltrans) recommended that the Bay Bridge’s Eastern Span be rebuilt instead of retrofitted, the state’s transportation agency has been accepting design proposals for a new, earthquake-safe Bay Bridge.

Of the several designs submitted, urban simulations were created from the three designs that directly resulted from the study — the Skyway, a simpler and lower-cost option; the Single Cable Stay, a mid-range rebuild alternative; and the Double Cable Stay, a more elaborate "landmark" design. The three simulations took a total of four man-weeks to complete. After a series of reviews, the Bay Bridge seismic enhancement is scheduled to be completed by 2004.

To create the bridge simulations, engineers used MetroSim, a new real-time graphics application unveiled by Coryphaeus Software last month. With this software running on a Silicon Graphics Onyx2 workstation, engineers combined existing 3D wireframe models of each bridge design with a wide variety of real-world data. This real-world content included Bay Area terrain data supplied by the US Geological Survey; a 5-meter satellite image of the Bay Area and its surroundings from ImageLinks; bridge textures; high-resolution models of the nearby Golden Gate Bridge, Treasure Island buildings, and ships from Viewpoint DataLabs; and photorealistic skylines of both San Francisco and Oakland. The resulting simulations place viewers in an immersive and realistic environment to give them a true sense of the bridge’s appearance in its surrounding environment.

"Urban simulation vastly broadens the use of real-time 3D graphics technology beyond its traditional applications in defense, aerospace, and entertainment," said John Murphy, president of Coryphaeus Software. Drew Henry, director of advanced graphics marketing at Silicon Graphics added, "Since most of the cost of a major construction project is committed when a design concept is chosen, urban simulations can save project planners millions by giving them the best possible insight into the final result even before construction begins."

Coryphaeus Software’s MetroSim was developed specifically for modeling urban scenes that incorporate buildings, roads, rivers, trees, and vehicles. With these simulations, developers and planners can interactively visualize the many components, building types, and infrastructure that comprise cities, as well as entire cities. MetroSim integrates three Coryphaeus tools to provide a comprehensive urban simulation software solution — Designer’s Workbench for real-time 3D modeling, texturing, and model optimization; DXF Translator for importing existing graphics files; and EasyScene for establishing the environment and real-time display. Complementing MetroSim is EasyT, Coryphaeus’ tool for creating the massive terrain databases used in realistic simulations.

Acquiring PCs In Today’s Rapidly Changing Market (01jun97)

June 1, 1997

One of the questions we frequently are asked is whether an organization should purchase new PCs now or wait since performance keeps going up and prices are coming down. Obviously, one answer does not fit all situations. The right solution depends upon the age and configuration of existing hardware and the software you are currently using or would like to use.

In general, however, we believe that architects and engineers should be provided with the latest equipment an organization can afford and that older units be moved to people with less demanding tasks. It is somewhat shocking to find that many technical people have better systems at home than they have at work.

Too many financial managers believe that they are saving their companies money when they hold off upgrading systems that have not yet been fully depreciated. It does not take a large improvement in the productivity of a professional who costs $80,000 per year, including benefits, to justify a new computer that might be an order of magnitude faster than what is currently being used.

The Price/Performance Explosion

The accompanying table tracks the typical PC one could buy for about $3,000 during each of the past five years. We have used Gateway 2000 for this data since that company has been very consistent in its product offerings. For the past decade or so, the guideline in the computer industry was that price/performance doubled every two years. Welcome to 1997, where it appears that this ratio is doubling annually. A number of factors have caused this change:

  • Intel has significantly improved the performance of its microprocessors to the point where they compete effectively with all except the best RISC processors used in UNIX workstations.
  • In order to maintain its market dominance, Intel has aggressively priced its new microprocessors. When the Pentium first came out, a 60-MHz version sold for nearly $1,000. Less than four years later, a 233-MHz Pentium II (see related article in this issue covering this new microprocessor and some of its competitors) sells for just $636 in similar quantities. This is nearly an order of magnitude increase in price/performance during this short period of time.
  • The huge volume of PCs sold has resulted in incredible economies of scale. While less than 500,000 UNIX workstations will be sold this year, over 80 million PCs will be shipped worldwide.
  • Memory costs have plummeted. Until about 18 months ago we were paying about $50 per megabyte for DRAM memory. Supply caught up with demand and prices dropped to under $10 per megabyte. Machines with 64 MB memories are readily available for less than $3,000. In many cases, a large memory can have a more significant impact on performance than microprocessor speed. Likewise, disk storage costs have also dropped like a rock.
  • Graphics accelerator cards have come down in price to the point where the PC manufacturers include them in the base configuration.

Getting The Most For Your Dollar

Having spent a fair amount of time watching the PC evolve, we have a few suggestions on procuring new PCs.

  • You are better off buying a system somewhat below the top of the line and replacing it more frequently than you would a more expensive system.
  • Install plenty of memory. Most CAD and GIS packages thrive on memory and at $8 to $10 per MB, you can easily afford 64MB or even 128MB.
  • Do not overlook a high-performance backup device. With 3.2 GB disks commonly used today, we recommend a 4 GB cartridge tape drive.
  • Buy the entire system from one source. Your time is too valuable to be spent getting hardware and software configured to work together.

It’s Show Time! (01may97)

May 1, 1997

The A/E/C Systems conference and exhibition is this industry’s showcase event for the technologies, products, and personalities that are shaping it today and propelling it toward the future. Historically, the A/E/C Systems show has been the launch venue for a number of very prominent products, and this year will probably be no exception.

We are looking forward to attending A/E/C Systems in Philadelphia, scheduled for June 16-19. This year’s event is expected to draw more than 25,000 attendees and between 400 and 500 exhibitors. This show can be a grueling ordeal, just based on the sheer number of vendors and products, but it does let us see most of the major AEC and GIS players under one roof, where we can evaluate their offerings objectively.

We strongly encourage you to also consider attending so you can evaluate the products and question the vendors as an informed potential buyer, because one product does not a comparison make. To really get a good feel for what different products can really do, you have to see them operate with the same data, constraints, or parameters (all preferably yours) side by side on a level playing field.

Beware, though, that just about any vendor worth its salt can make its product look good -- especially if it’s the only one being evaluated and "verbally compared." Even in this verbal comparison, you must be ready to spar with tough questions. Try to remain skeptical and objective, because many vendors will tend toward being offensive and subjective about their products, especially when pressed on how their products stack up against the competition.

As much as we would like to think otherwise, and as much as vendors would like you to think otherwise, all software products, AEC or otherwise, are not created equal. Since this is a market in a seemingly constant state of flux, many customers approach vendors with a certain air of skepticism and uncertainty. Unfortunately, these feelings too often play into the hands of unscrupulous salespeople who assure potential customers that only they have the solution the customer before them is seeking. In effect, this opens the door to snake oil marketing ploys by vendors. Of course, it’s always the "other" vendor’s direct salespeople and VARs who are at fault as the snake oil purveyors. Surprisingly, though, it is often those finding fault with other vendors who fail to support the claims they make for their own products.

So why does the best product often have few takers? Sadly, in this industry, as in virtually all others, it’s often not the content of the product that sells, but the packaging. Once again, it is not a case of who has the best product, but rather of who does the greater volume and the better job of splashy, memorable marketing, that comes out the winner in the product sales game.

Jeffrey Rowe, Editor

Digital Signature Software For CAD

May 1, 1997

Computers and software applications have dramatically accelerated the pace of many activities that were previously handled physically, including design drawings, 3D visualizations, and document management. Some processes, however, such as the approval of new CAD drawings for manufacturing, largely remain in the physical world due to inadequate software functionality.

However, it now appears that at least one company, Silanis Technology, has developed an adequate software solution that is reliably secure, does not require a significant change to a company’s procedures, is easy for people to use, and still dramatically speeds up the process for obtaining approvals with signatures (albeit digital ones).

What’s Common Today

The two key aspects of a document’s approval in the physical world are its originality and the reliability of the handwritten signatures that allow the document to move to the next stage. Any changes made to the original drawing afterward would be easy to spot.

A bitmapped signature pasted onto a CAD file is easy to copy and the file can still be changed later with the approving signature intact. A signature could be input via a digitizer pad and verified against a database, but the file could still be altered afterwards even with the approving signature in place. Neither method effectively replaces the physical method.

Digital signatures based upon currently unbreakable public-private key cryptography, such as PGP (Pretty Good Privacy), are secure, along with their encrypted documents. However, Silanis correctly points out that the many forms programs (such as Lotus Notes) that use public-private keys require companies to change their procedures to a totally electronic paradigm without decent provisions for printed copies.
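The tamper-evidence that public-private key signing provides can be sketched in miniature. The following toy example uses textbook RSA with small primes -- nothing like the key sizes PGP actually uses, and insecure by design -- to sign a digest of a drawing; any later edit to the file makes verification fail:

```python
import hashlib

# Textbook RSA with small primes -- insecure, purely to illustrate the idea.
p, q = 10007, 10009
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def _digest(document: bytes) -> int:
    # Reduce a SHA-256 digest into the toy key's range.
    return int.from_bytes(hashlib.sha256(document).digest(), "big") % n

def sign(document: bytes) -> int:
    """Only the private-key holder can produce this value."""
    return pow(_digest(document), d, n)

def verify(document: bytes, signature: int) -> bool:
    """Anyone with the public key (e, n) can check the signature."""
    return pow(signature, e, n) == _digest(document)

drawing = b"PART-1047 REV B: bore diameter 12.70 mm"
signature = sign(drawing)
assert verify(drawing, signature)                                   # unmodified: passes
assert not verify(drawing.replace(b"12.70", b"12.75"), signature)   # edited: fails
```

Because only the digest is signed, the signature stays small no matter how large the drawing file is; that is the same trick real public-key systems use.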

ApproveIT Leads The Way

Silanis’ ApproveIt program mimics the physical paradigm with digitized handwritten signatures. Once a document such as a drawing has been signed—with a person’s captured signature file or signed real-time via a digitizer pad—the ApproveIt software monitors the status of the document. If the document is modified later, the signature will not display on the computer screen in the appropriate place on the document nor will the signature print with the document.

The signature and the approval-related information are embedded in the actual document. As a result, the document carries the signature and security monitoring with it even if the document leaves your computing environment. There are no ties to network operating systems, network protocols, e-mail systems, or PDM or EDM systems. All ApproveIt applications are client-based, with no server software required.

Signatures are preferably input into the system on a digitizer pad, and the resulting high-resolution (1200 dpi) file can be reused. The file is encrypted and can only be used once its password is provided. The system can also require a fresh signature for each approval, although this would require a company to purchase enough digitizing pads to make it convenient.
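Silanis does not publish its file format, but the general "signature travels with the document" mechanism can be sketched as follows: the approval record embeds a digest of the file contents as they stood at signing time, and the viewer shows the signature only while that digest still matches. All names here are our own, not Silanis':

```python
import hashlib

def approve(contents: bytes, signer: str) -> dict:
    """Attach an approval record carrying a digest of the contents as signed."""
    return {"signer": signer,
            "signed_digest": hashlib.sha256(contents).hexdigest()}

def signature_visible(contents: bytes, record: dict) -> bool:
    """Show (or print) the signature only if the document is unchanged."""
    return hashlib.sha256(contents).hexdigest() == record["signed_digest"]

drawing = b"...drawing file contents..."
record = approve(drawing, "J. Smith, P.E.")
assert signature_visible(drawing, record)                 # intact: signature shows
assert not signature_visible(drawing + b"edit", record)   # modified: it vanishes
```

Since the record is stored inside the document itself, no server, network protocol, or document management system needs to participate in the check, which matches the client-only design described above.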

ApproveIt for CAD 2.1 ($399 per copy) only works with AutoCAD R12 and R13 running under DOS and Windows 3.1/95/NT. ApproveIt is also available for Microsoft Word 6.0-7.0 and Excel 5.0-7.0, and WordPerfect 6.1 for $149. Site licensing is also available.

Silanis plans to ship a software toolkit to integrate ApproveIt’s electronic signature and encryption capabilities into other applications. This should include other CAD programs such as MicroStation as well as CAD viewers, enterprise document management (EDM) systems, workflow, e-mail, and groupware.

The toolkit is a set of application programming interfaces (APIs) bundled in a dynamic link library (DLL) that can be accessed by several programming languages. It sells for $1,995 plus $99 per runtime license, and supports Windows 3.1x, 95, and NT (but not DOS).

Bentley Systems and Netscape recently announced a joint effort to bring digital signature security to large-scale engineering projects on corporate intranets. That effort is focusing on the encryption of the CAD files, but it does not appear to provide the means to completely replace physical routing of paper documents and appropriate sign-offs from key personnel. In the meantime, Silanis appears to be the only game in town.

Contact: Silanis Technology, 888-745-2647, 514-683-3232,

OpenGIS Consortium Making Real Progress Toward A Real GIS Standard (01may97)

May 1, 1997

The Open GIS Consortium (OGC) is a membership organization dedicated to open system approaches to geoprocessing. Through its consensus building and technology development activities, OGC has had a significant impact on the global geodata and geoprocessing standards community, and it continues to promote a vision of OpenGIS technologies that integrate geoprocessing with the distributed architectures of the emerging global information infrastructure. OGC recognizes that new technologies and new business models evolve in tandem, so by means of an open and formal consensus process, it is creating the OpenGIS Specification, an unprecedented computing framework and software interface specification that is a necessary prerequisite for geoprocessing interoperability.

OGC was founded primarily to address the following needs:

  • To integrate geographic information contained in heterogeneous data stores whose incompatible formats and data structures have prevented interoperability. Until now, this interoperability barrier has seriously limited the ultimate usefulness of GIS technology.
  • To improve access to public and private geodata sources.
  • To enable agencies and vendors to develop standardized approaches for specifying geoprocessing requirements for information systems.
  • To work with the GIS industry to incorporate geodata and geoprocessing resources into national information infrastructure initiatives. These geographic resources must be as easy to find and use as any other network-resident data and processing resources. To accomplish this level of integration, the industry realizes it needs to synchronize geoprocessing technology with emerging information technology standards based on open systems concepts, distributed processing, and object/component-based software frameworks.
  • To preserve the value of legacy geoprocessing systems and legacy geodata while incorporating new geoprocessing technologies.

In short, OGC acts as the technical catalyst for merging GIS, GPS, CAD, earth imaging, and spatial databases with virtual reality, multimedia, network computing, and non-spatial desktop applications. It is also forging working relationships for delivering geospatial applications to business, government, education, and entertainment.

Heady Aspirations From Humble Beginnings

OpenGIS is defined as transparent access to heterogeneous geodata and geoprocessing resources in a networked environment. From the beginning, the goal of the OpenGIS Project has been to develop a comprehensive open interface specification that lets developers write software that provides open, interoperable capabilities. The OpenGIS Project began in 1993 with limited support from a few federal agencies and commercial organizations who funded meetings to discuss the feasibility and possible scope of the proposed OpenGIS Specification (OGIS) and specification-writing project. After the original participants determined that a useful specification could be developed, OGC was founded in August 1994 to provide a formal structure and process for developing the specification.

Today, OGC manages the OpenGIS Project as a formal consensus process involving key organizations in the commercial, academic, and government sectors of the geographic information community. All members participate in the OGC Technical Committee and its working groups under the supervision of OGC's Management Committee, which develops OGC's business plan and makes all final decisions regarding the scope of the specifications and the project.

The participation of these individuals indicates the degree to which OpenGIS technologies are seen by the organizations they represent to be a necessary next step toward creating a global open system geoprocessing architecture.

OGC Technical Committee

The OGC Technical Committee is the primary operational unit of the OpenGIS Project. It comprises the technical representatives of all OGC member organizations and is charged with creating the OpenGIS Specification. The Technical Committee does the bulk of its work through its Working Groups.

The OGC Technical Committee has created an abstract specification, a single detailed guide for writing interoperable geoprocessing software. In order for products to emerge, this needs to be implemented on industry-accepted distributed computing platforms (DCPs), such as OLE/COM, CORBA, and the Internet's http and Java standards. While the Technical Committee continues to develop the remaining parts of the abstract specification, it is also working with vendors and research groups through an RFP (request for proposal) process to develop DCP-specific OpenGIS implementation specifications.

Software products that are compliant with an implementation specification will be interoperable within a DCP. In addition, every effort is made to conform the different implementation specifications and to work with the DCP developers to achieve maximum geoprocessing interoperability between DCPs.

At both Management Committee and Technical Committee levels, OGC maintains an Application Integration structure which focuses on coordinating activities designed to help technology providers address application-specific needs that are common to technology users in industry sectors. The goal of this committee structure is to evaluate, test, implement, and extend the OpenGIS Specification to ensure its practical utility in a wide variety of application areas. By inviting the participation of user communities, OGC can provide community-tailored specifications which enable diverse vendors and integrators to build interoperable products and service environments.

Summary Of OpenGIS Technologies

The OpenGIS Specification is a comprehensive software architecture specification that provides a standard way to represent all kinds of geodata in software and a common set of services to support distributed geoprocessing across heterogeneous hardware platforms. Programming interfaces based on this specification will enable true interoperability between applications on the desktop, and they will enable access to heterogeneous geodata and geoprocessing resources across local and wide area networks.

The OpenGIS Specification releases GIS, remote sensing, and other geoprocessing disciplines from the constraints of proprietary and incompatible data formats and isolated applications, and moves them into the world of software components and network-based computing. The OpenGIS Specification is like other distributed object-oriented software systems in its basic approach, but it is the first large-scale application of object technology for managing spatial data in the contexts of global and national information infrastructures.

The OpenGIS Specification specifies a well-ordered environment designed to simplify the work of both developers and users. Developers adhering to the specification will create applications able to handle the full range of geodata types and automatically negotiate vast geodata and geoprocessing resources on a network. Users of geodata will be able to share a huge networked data space in which all kinds of spatial data will be usable without difficult and time-consuming batch transfers and conversions, even though the data may have been produced at different times by unrelated groups using different production systems for different purposes.

The OpenGIS Specification architecture provides:

  • A single "universal" spatial/temporal data model — the Open Geodata Model (OGM).
  • A set of services for manipulating data represented using the OGM — the OGIS Services Architecture.
  • A Geographic Information Communities model that addresses the problem of different communities of users using different semantics for the same spatial features. This provides an efficient means for data discovery using catalogs and an efficient means for data integration using semantic translators.

The OGM is the core component of the Specification. It consists of a hierarchical class library of geographic information data types that comprise the shared data environment and unified data programming interface for applications. Every geoprocessing system (GIS, earth imaging, digital cartography, navigation, etc.) has a geodata model where terrestrial objects and events are represented in software. Basically, the OGM provides a means for mapping one geodata model to another.
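As a sketch of that mapping idea (every class and field name here is our own invention for illustration, not OGC's), a shared feature type can serve as the bridge between two hypothetical vendors' incompatible representations:

```python
from dataclasses import dataclass

@dataclass
class Point:            # a shared "universal" geometry type
    lon: float
    lat: float

@dataclass
class Feature:          # geometry plus attributes, in the spirit of a common geodata model
    geometry: Point
    attributes: dict

# Hypothetical vendor A stores ((x, y), attrs) tuples;
# hypothetical vendor B wants keyed records.
def from_vendor_a(rec) -> Feature:
    (x, y), attrs = rec
    return Feature(Point(lon=x, lat=y), dict(attrs))

def to_vendor_b(f: Feature) -> dict:
    return {"coords": [f.geometry.lon, f.geometry.lat], **f.attributes}

parcel = from_vendor_a(((-71.06, 42.36), {"soil": "loam"}))
assert to_vendor_b(parcel) == {"coords": [-71.06, 42.36], "soil": "loam"}
```

With n vendor formats, each vendor writes one adapter to and from the common model instead of n-1 pairwise translators, which is the economic argument for a single shared geodata model.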

The OGIS Services Architecture is a consistent open development environment characterized by a reusable object code base and a set of spatial access and spatial processing services. This architecture supports complex query processing and also specifies standard methods for requesting and delivering geospatial transformations and processing tasks. It facilitates transformations between private data and OGM constructs, as well as coordinate conversion and raster/vector conversion. Tools built using these catalog systems will bring to geodata access the same degree of expanded capability that Web browsers and HTML have brought to the much simpler world of text data.

Much of the momentum for the OpenGIS project comes from a need to share geographic information more effectively and to improve the efficiency of communication between individuals and organizations who not only store and manipulate geographic information in different ways on different computer systems, but who think and talk about and visualize geography in very different ways. Undertaking to establish effective data communications within a well defined technical context poses a reasonably limited problem. Undertaking to establish means for sharing information with little or no loss across the technical and human barriers that exist within and among diverse human groups and institutions poses a problem of much greater scope. OGIS must ultimately help solve both the technical and the larger human problem.

To address this problem, the OGC Technical Committee has devised a Geographic Information Communities model that provides a framework using OpenGIS technologies to dissolve many of the barriers that prevent sharing of digital spatial data and spatial processing services. Some barriers, of course, cannot be overcome by technology. But when semantic translators and other methods become widely used, Geographic Information Communities will be able to see clearly what they might do at the organizational level to overcome such barriers and make their spatial information systems more interoperable with those of other Geographic Information Communities.

Progress Continues Toward Open Geoprocessing

In January, OGC announced that it had received excellent responses to its Requests for Information (RFIs) which are part of the OpenGIS Interoperability Specification effort. The RFIs cover geographic information catalog services and imagery.

Seven organizations submitted information in response to the OpenGIS Catalog Services RFI. These submissions suggest requirements to include in an RFP for standard mechanisms for cataloging geographic information in a networked environment of heterogeneous spatial databases. Among the submitters were the ISO (International Organization for Standardization) Cataloguing Project Team, the US Federal Geographic Data Committee, the US Department of Defense's National Imagery and Mapping Agency, and the Committee on Earth Observation Satellites.

Space Imaging (Thornton, CO), SPOT Image (Reston, VA), TRIFID (St. Louis, MO), and the US DoD National Imagery and Mapping Agency (NIMA) responded to OGC's OpenGIS Imagery RFI to ensure completeness of the imagery parts of an RFP for specifications that will define coverages that include images (such as satellite images) and certain maps (such as digital elevation maps) where each point has a distinct value.

David Schell, president of OGC, said, "Geospatial catalog services will work underneath applications that are as easy to use as a Web browser, and the OpenGIS Project's work with coverages will enable all types of geospatial data to be queried with the same user-friendly tools. Our members are dealing with very sophisticated technologies, but they have their eyes on the non-expert user."

Late in 1996, geographic information system (GIS) vendors cooperated to submit proposed OpenGIS Specifications in response to OGC's first Request for Proposals (RFP), which addresses "simple geometry" (points, lines, and areas) and attributes (such as soil type). Compliant products expected later this year will be able to respond to each other's spatial queries.

Diverse Membership Speeds Accomplishments

The total number of OGC members now exceeds 90, with most of the major design/engineering automation and GIS vendors represented, as well as other organizations from the private and public sectors and academia. Even with such a diverse group, cooperation among the members is yielding real progress on several fronts.

Software and computer vendors, integrators, data suppliers, telcoms, universities, government agencies, and industry associations join OGC primarily for the following reasons:

  • To quickly enact distributed geoprocessing standards, which will not happen without their help.
  • As technology users, to shape geoprocessing standards to meet their needs; plan intelligently for technology acquisition and technology-induced organizational change; and evaluate and begin working with technology providers.
  • As technology providers, to make sound and timely business and development decisions; ensure that the OpenGIS Specification meets their needs; begin development of OpenGIS-compliant products as soon as possible; form relationships that help their businesses during a risky period of change; and solve shared problems.

As an example of progress, earlier this year a group of GIS technology providers, including Bentley Systems, demonstrated a CORBA/Java-based method for interoperability of GIS data and systems based on object-oriented technology. The software architecture was presented to the entire OGC technical membership community in response to their formal request for a CORBA-compliant implementation. Bentley’s submission was developed in cooperation with Genasys, Lockheed Martin, Mitre Corp., Oracle, Sedona Graphics, and Sun Microsystems.

David Schell, president of OGC said, "We are pleased that Bentley has been such an active and contributing participant in our specification activity. It was gratifying to see the working prototype of the CORBA-based interface put forward by Bentley."

This implementation is significant because CORBA could well become the preferred communication vehicle for sharing GIS information between systems. Its multiple-platform architecture is compatible with existing GIS systems and the trend toward data servers and Internet clients. If accepted by OGC, the implementation will become the industry’s CORBA-based interoperability benchmark to be used between member GIS systems.

The CORBA implementation was based on Bentley’s ProActiveM technology and an objective version of MicroStation GeoGraphics that offers comprehensive interoperability of GIS data and systems.

OGC is creating the Open Geodata Interoperability Specification (OGIS), a set of software specifications for sharing geographic information across tools and organizations. Last summer, OGC issued a request for proposal (RFP) for interface standards in the areas of CORBA, OLE, ODBC, and the Internet. Final decisions on these standards are planned for later in 1997.

In January, Microsoft Corp. (Redmond, WA) became a Principal Member of the Consortium. As a Principal Member, Microsoft has a seat on OGC's Management Committee, which oversees the OGC Technical Committee that plans cooperative industry development programs.

Carl Stephen Smyth, Lead Geographer in Microsoft's Geography Product Unit, said, "We were happy to join the OGC industry effort. The goal of interoperable geo-information is very important to us and we look forward to helping ensure its early commercial success."

Consortium members recently showed cooperation in responding to OGC's first RFP for simple geographic features by creating a draft engineering specification for standard interfaces that will enable diverse OLE/COM-based software systems to access each other's geographic data. Competing GIS vendors Intergraph (Huntsville, AL), Autometric (Alexandria, VA), ESRI (Redlands, CA), Laser-Scan (Cambridge, UK), MapInfo (Troy, NY), Smallworld (Englewood, CO), and integrator Camber Corp. (Huntsville, AL) met several times to merge their proposed specifications into a single submission for Microsoft's OLE/COM platform.

Intergraph led the team that responded to the RFP for OLE/COM. Preetha Pulusani of the Infrastructure Product Center, Intergraph Software Solutions, explained, "Intergraph has been closely associated with and witness to the tremendous growth of 32-bit Windows in our market segments, ever since our early adoption of Windows NT. The significant investment we made on this platform beginning in 1992 has yielded extremely positive returns and user acceptance. Software based on Windows NT is now the fastest growing segment of the GIS market."

"We are excited to have Microsoft in the Consortium. Like all OGC members, Microsoft has a strong interest in network-based computing that involves geographic information," said OGC’s David Schell.

Another prominent GIS/CAD provider, Autodesk, was an early OGC supporter and member that has for some time led an effort toward open standards with its own DXF data file transfer format. It has also built into its products the Industry Foundation Classes (IFC) and Guidelines, which were created in conjunction with the Industry Alliance for Interoperability (IAI) within the AEC industry. Autodesk has committed to supporting the OGC standards throughout its entire mapping and GIS product family. Because more maps reside in Autodesk’s DWG format than in any other, the company’s reason for involvement in OGC goes without saying. "Industry growth has been hampered by competing proprietary technologies, and we are committed to creating an environment where GIS information can be freely shared, regardless of platform," said Joe Astroth, vice president of Autodesk’s GIS Market Group.

This statement is especially noteworthy because until now, proprietary, complex, and incompatible data formats and non-interoperable geographic processing systems severely limited the use of digital geographic information and the growth of the GIS market. OGC is working to ensure that geographic information of all kinds will be able to fulfill its potential role in emerging national and global information infrastructures.

The Importance Of OGC

The continuing work of OGC is extremely important, especially in light of the fact that new GIS products and users are proliferating. It is, therefore, essential that standards be established so that data is interoperable and useful across different users, products, and platforms. The GIS industry itself realizes the importance of the OGC’s outcome, and its members are exhibiting unusual behavior toward that end: they are cooperating to meet the needs of users in the currently mixed-up world of too-often proprietary GIS data.

Since we feel that the work of the OGC and its implications are of vital importance to many of our readers, we will devote considerable space to this endeavor as the standards for geodata and geoprocessing evolve over the next several months.

In our next installment, we will cover how the Federal Geographic Data Committee (FGDC) is developing the National Spatial Data Infrastructure (NSDI) as a standard for ensuring consistency in digital geospatial data.

Contact: OpenGIS Consortium

Bentley Systems Builds Data Bridge To The 21st Century With Its Geoengineering Strategy (01apr97)

April 1, 1997

As vendors of GIS and design automation technology look toward the next millennium, many are still trying to figure out the best way to get there. This isn’t the case, however, with Bentley Systems. The company is poised for the future and is positioning itself very well with a number of new strategic technologies, products, and alliances — and it is doing it its own way.

In many respects, Bentley has always had its own way of looking at things, and GIS is a perfect example of this. What is commonly referred to as a geographic information system (GIS) in most other circles is termed geoengineering in Bentley vernacular, although Bentley’s term seems to be gaining wider acceptance. The company defines geoengineering as the convergence of engineering and planning technologies that integrate the precision and databases of CAD with the spatial analysis and planning capabilities of GIS. In other words, geoengineering is sort of a CAD/GIS hybrid. This broad definition, then, lets Bentley provide a wide range of products that address the entire life cycle of infrastructure and facilities projects, including planning, design, engineering, analysis, construction, operations, management, and maintenance.

Bentley Beginnings

Bentley was founded in 1984, offering Intergraph IGDS users a lower cost method for adding additional seats to their Digital VAX 11/780-based systems. The software provided by Bentley enabled these users to accomplish the same functionality as before, but with lower cost terminals. This software evolved into what we know today as Bentley’s core technology — MicroStation. Intergraph acquired a 50-percent interest in Bentley Systems, and by the late 1980s MicroStation was the basic graphics system supporting Intergraph’s applications in nearly all the markets in which the company participated. Until the end of 1994, Intergraph was fully responsible for the sales and marketing of MicroStation worldwide.

Since Bentley took over responsibility for the sales and marketing of MicroStation from Intergraph two years ago, the company has quickly matured into a "heavy hitter" developer and marketer of technical solutions for the architecture, engineering, and GIS communities. From less than 200 people on board at the end of 1994, Bentley has grown to be a 700-person organization with over $120 million in annual revenues. Bentley has more than 250,000 users around the world; over 65,000 subscribers to Bentley SELECT, the company's comprehensive service and licensing program; more than 600 independent software developers (ISDs) providing applications for MicroStation; and more than 500 value-added resellers (VARs) for MicroStation worldwide.

Bentley’s goal is to maximize the productivity and collaboration of large-scale geoengineering projects. These projects typically involve complex processes with many users, a multitude of information, and long life cycles. Bentley offers engineering software products and services that uniquely scale to the entire engineering process.

Bentley's MicroStation products offer users high-performance solutions for their specific needs. MicroStation includes a full range of focused products for design, planning, drafting, review, maintenance, and field operations. The unified architecture of MicroStation ensures that all products fully interoperate. Also, Bentley offers engineering-level applications for productivity far beyond traditional tools.

MicroStation and its MicroStation Development Language (MDL) also serve as an integration and development environment. The products can hold and securely manage the full range and diversity of large-scale project data. MicroStation also integrates with enterprise information systems.

The Engineering Back Office

Recently, Bentley has expanded its focus beyond design/engineering to also address the integration of engineering-related software with the rest of an organization’s information technology (IT) infrastructure. This effort really began last November when Bentley announced the start of its Engineering Back Office (EBO). Topping this strategy are a new line of Internet/intranet-enabled middleware products, called ModelServers, as well as agreements with Netscape and Oracle. These moves hold major significance because, for the first time, engineering organizations can unite their desktop systems and data with enterprise IT systems and databases.

The Engineering Back Office taps the emerging IT infrastructure of the Internet and three-tier client/server technology as the basis of connectivity and sharing information. The new server products, including ModelServer Publisher and ModelServer Continuum, allow engineering data to be stored in corporate databases, integrated with enterprise data, and served to enterprise clients, including desktop applications and Web-browsers.

The company demonstrates an excellent understanding of the ongoing transition from traditional mainframe solutions, to the client/server computing of the past decade, to today’s three-tier client/server approach to system architecture. Software can be segmented into three general categories:

  • Presentation – What the user works with on the desktop.
  • Applications – The software that responds to user inputs and, in some manner, manipulates stored data.
  • Storage – File and database management tools.

These three types of software can exist on one or more computer systems. Many contemporary systems use a PC or workstation client for both the Presentation and Application code, with a separate server for Storage. The three-tier concept attempts to reduce the amount of software on the client and creates a new layer called an Application server that sits between the clients and the Storage servers. The software utilized on the Application servers is often referred to as "middleware."
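
The three-tier split described above can be sketched in a few lines of Python. This is a present-day illustration with invented names, not any Bentley interface: a thin Presentation client forwards requests to an Application server, which is the only layer that touches Storage.

```python
# Illustrative sketch of the three software tiers; all names invented.

class StorageServer:
    """Storage tier: file/database management."""
    def __init__(self):
        self._records = {}

    def get(self, key):
        return self._records.get(key)

    def put(self, key, value):
        self._records[key] = value

class ApplicationServer:
    """Application tier ("middleware"): responds to client requests
    and manipulates stored data on the client's behalf."""
    def __init__(self, storage):
        self._storage = storage

    def handle(self, request, payload=None):
        if request == "save":
            key, value = payload
            self._storage.put(key, value)
            return "ok"
        if request == "load":
            return self._storage.get(payload)
        return "unknown request"

class ThinClient:
    """Presentation tier: knows nothing about storage details."""
    def __init__(self, app_server):
        self._app = app_server

    def save_drawing(self, name, data):
        return self._app.handle("save", (name, data))

    def open_drawing(self, name):
        return self._app.handle("load", name)

storage = StorageServer()
middleware = ApplicationServer(storage)
client = ThinClient(middleware)
client.save_drawing("site-plan", "<DGN data>")
print(client.open_drawing("site-plan"))  # -> <DGN data>
```

Note that swapping the storage implementation requires no change to the client, which is exactly the transparency argument made below.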

Although this structure seems much more complex than simply networking a bunch of desktop computers together, there are some distinct advantages. These advantages include:

  • Many enterprise IT departments are moving to the three-tier architecture. Adapting engineering and GIS software to the same concept will facilitate integrating the two parts of the organization.
  • A modular three-tier approach makes it easier to change one element of a computer solution without impacting the rest of the system.
  • Database changes can be made transparent to the clients.
  • This three-tier approach is a natural concept for Web-related solutions.
  • Security is enhanced since the client software does not have direct access to database information.

ModelServer – Bentley’s New Middleware

Bentley’s MicroStation product line will continue to be the company’s primary software for desktop clients. Server-resident products, however, will carry the "ModelServer" designation. The name reflects the fact that this software responds to requests related to engineering models. One major objective is to enable MicroStation users to access data in "foreign" formats and to enable users of other graphic systems to access MicroStation files. This needs to be done without worrying about which release of a particular package created a given file and without the need to explicitly translate files from one format to another.

Beginning in the early 1990s a new software architecture, called three-tier client/server, became popular. In the three-tier C/S architecture, portions of the application layer are moved to a new set of server programs, called application servers, also known as middleware servers. As a result, the client programs can be much thinner and more flexible. Rather than directly connecting to data servers, clients communicate with intermediate application servers, which either process requests locally or forward them to other application servers. Application servers, in turn, communicate with data servers for data storage and retrieval.

The three-tier C/S model offers several advantages over the desktop model. Database designers have far greater flexibility in changing the implementation since changes can be totally transparent to the clients. Since key application results can be calculated on servers controlled centrally, clients are not required to understand all of the subtle nuances of such calculations, including proper handling of error conditions. Clients in a three-tier system are designed to be as thin as possible, with much mission-critical application processing and data synchronization performed on application servers. Application developers have more flexibility in optimizing performance and network traffic by distributing the application load between the client and the server.

Internet browsers connected to Internet servers are a perfect example of a three-tier C/S system. The program that responds to a browser's request for a uniform resource locator (URL) is an application server that attempts to locate and potentially reformat data from a data server. Alternatively, it might attempt to create the response locally by executing other server resident programs on behalf of the browser.
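
The URL-dispatch pattern just described can be reduced to a small sketch (hypothetical names; Python used purely for illustration): the application server either locates stored data on behalf of the browser or executes a server-resident program to create the response.

```python
# Sketch of a Web application server's two response paths; invented names.

def make_url_dispatcher(data_server, programs):
    """data_server maps paths to stored pages; programs maps paths to
    server-resident callables executed on the browser's behalf."""
    def respond(url):
        if url in programs:        # create the response locally
            return programs[url]()
        if url in data_server:     # locate (and possibly reformat) stored data
            return data_server[url]
        return "404 Not Found"
    return respond

pages = {"/index.html": "<html>Welcome</html>"}
cgi = {"/time": lambda: "<html>It is noon</html>"}
respond = make_url_dispatcher(pages, cgi)
```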

The benefits of the three-tier approach are evident in the context of Internet browsers, the ultimate thin client. The same program can be used to execute a virtually unlimited number of application server programs, with very few local configuration dependencies. Almost all Web pages can be accessed from any browser program, regardless of the platform on which it runs. Of course, there are exceptions such as plug-ins that are browser and platform-specific, but for the most part the Web is based on a single, platform-independent standard, HTML.

In the last year or so, however, a new twist has been added to the Internet C/S architecture—Java. Java can be viewed as a portable application layer that moves between application servers and desktop clients on demand. This is one way of reconciling the conflicting goals of keeping clients thin while performing as much application processing as possible local to the client. However, to accomplish this, Java applications must be portable across all potential client platforms.

The ModelServer Product Line For The Engineering Back Office

ModelServer Publisher — was the first Engineering Back Office product. The server product electronically publishes Bentley’s MicroStation or Autodesk's AutoCAD drawings, maps, and models to any Web browser-based desktop. ModelServer Publisher works on-demand from centrally stored data and guarantees that published information is accurate and up-to-date. Unlike file conversion "plug-ins", ModelServer Publisher has zero cost-of-administration for desktop users.

ModelServer Publisher and Bentley’s Engineering Back Office are based on Internet software server technology from Netscape and engineering software from Bentley. Under the agreement between the two companies, Bentley is integrating Netscape's FastTrack Server and Enterprise Server software products into the Engineering Back Office.

ModelServer Publisher server-based software converts engineering data in formats such as MicroStation DGN and AutoCAD DWG files into a Web-based format on demand, enabling these files to be viewed by Web browsers, such as Netscape Navigator or Microsoft Internet Explorer. Beyond MicroStation and AutoCAD, it also handles many data formats found in CAD and GIS environments, such as SVF, CGM, VRML, and others. It will also include template pages for easy set-up by a Webmaster, and can be extended with Java applets. Using HyperText Markup Language (HTML) techniques, the software links the user to the actual drawing file without the need to install MicroStation or similar software on the client. ModelServer Publisher also converts all the applicable reference files, fonts and symbol libraries to a Web-based format.

We see ModelServer Publisher as a useful tool in moving beyond personal productivity applications to an enterprise workflow solution. It can be used in an internal intranet environment or accessed across the Internet with the same tools. In fact, a user need not know where the data is stored to use this package successfully.

Document security is a growing concern in the technical community. Current collaborative engineering solutions often involve transferring entire source documents across the network to others who need to view the information. Once the drawing file is transmitted, the creator of that file no longer has control over what subsequently happens to it. ModelServer Publisher minimizes this problem by providing the remote browser user with access to a view of the drawing; it does not actually transmit the source file to that user.
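
The access-without-transfer idea can be made concrete with a small sketch. The class and method names here are our invention, not the actual ModelServer Publisher interface: the server keeps the source drawing and hands the browser only a rendered view.

```python
# Sketch of serving a rendered view while the source never leaves the server.
# All names are hypothetical illustrations.

class DrawingServer:
    def __init__(self):
        self._sources = {}   # source DGN/DWG data stays on the server

    def store(self, name, source_data):
        self._sources[name] = source_data

    def view(self, name):
        """Return a read-only rendering; the source itself is never sent."""
        src = self._sources.get(name)
        if src is None:
            return None
        return "<rendered view of %d bytes>" % len(src)

server = DrawingServer()
server.store("plan.dgn", "raw DGN bytes")
print(server.view("plan.dgn"))
```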

There will initially be two versions of ModelServer Publisher: a single active channel version that incorporates Netscape's FastTrack Server software, and a multi-channel version that incorporates Netscape's Enterprise Server. A single active channel means that only one person at a time can access the data, but since each request takes very little time, many individuals can effectively share the system. The multi-channel version provides access to multiple users at the same time.

ModelServer Continuum — is the second major product of the Engineering Back Office. This server product creates a "contiguous" database of engineering and enterprise data by storing engineering maps and drawings in corporate IT databases and "serving up" data to MicroStation application and Web clients.

Because ModelServer Continuum utilizes corporate IT databases, it breaks down the walls between GIS/engineering and enterprise data. Local and federal government agencies, utility companies, plant owners, and large corporations who manage engineering assets will be among the first to create a "total information system." Such a system, which contains engineering, spatial, and enterprise data, will help everyone in those enterprises make far more informed decisions.

ModelServer Continuum also boasts advanced transaction management features to meet the special multi-user workflow demands of large engineering projects. First, it minimizes collisions through record-level locking, an advance over file-level locking. Second, it includes long-transaction management, which coordinates and resolves collisions that inevitably occur in extended work sessions in a multi-user environment. Lastly, it features project branching, which allows users to experiment with new ideas and later synchronize their changes with the project if necessary.
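
The first of these features, record-level locking, can be sketched in a few lines (illustrative only, not Continuum's implementation): users lock individual records rather than whole files, so two users collide only when they touch the same record.

```python
# Minimal record-level lock manager sketch; names are invented.

class RecordLockManager:
    def __init__(self):
        self._locks = {}   # record id -> user holding the lock

    def acquire(self, record_id, user):
        holder = self._locks.get(record_id)
        if holder is not None and holder != user:
            return False           # collision: another user holds this record
        self._locks[record_id] = user
        return True

    def release(self, record_id, user):
        if self._locks.get(record_id) == user:
            del self._locks[record_id]

locks = RecordLockManager()
locks.acquire("pipe-17", "alice")   # True
locks.acquire("pipe-17", "bob")     # False: record held by alice
locks.acquire("valve-3", "bob")     # True: a different record, no collision
```

With file-level locking, Bob's second request would have been refused as well, even though he and Alice are working on unrelated records.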

Last month, Bentley began beta shipments of the ModelServer Continuum product supporting Release 7.3.3 of the Oracle Universal Server Spatial Data Option (SDO). The resulting combination is the first commercial information system that spans enterprise and engineering data. ModelServer Continuum serves as an enterprise-wide engineering information broker connecting engineering client software from Bentley and other suppliers to data stored in Oracle Universal Server using SDO. In other words, Continuum acts as an application server on top of SDO. Supported clients include MicroStation GeoGraphics and MicroStation GeoOutlook, as well as ESRI’s ArcView.

Because ModelServer Continuum interoperates with ModelServer Publisher, it can deliver engineering data on demand to any desktop web browser. Organizations need to maintain only one copy of their data. The collective information, now including engineering and spatial data, can be shared throughout their extended enterprises.

ModelServer Continuum also provides multi-user access to engineering information with short- and long-term transaction management. It includes Bentley’s Open Engineering Connectivity (OEC), an API that makes all server functionality available to other applications.

MicroStation-based clients can use ModelServer Continuum to maintain all graphical and non-graphical data in a standard relational database, rather than in pre-segmented data files. When an edit session is initiated, the applicable data is extracted from the database, and a temporary set of design files is created. The MicroStation applications, such as MicroStation GeoGraphics, are used as they would be today, and when the edit session is completed, ModelServer Continuum places the modified information back in the database.
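
The edit-session round trip described above can be sketched as a simple check-out/check-in pattern. The function names and the dictionary-backed "database" are our illustration, not Continuum's actual API: the applicable data is extracted into a temporary working set, edited, and written back when the session ends.

```python
# Sketch of the extract/edit/replace cycle; hypothetical names throughout.

def begin_edit_session(database, record_ids):
    """Extract the applicable data into a temporary working set,
    analogous to Continuum creating temporary design files."""
    return {rid: database[rid] for rid in record_ids if rid in database}

def end_edit_session(database, working_set):
    """Place the (possibly modified) information back in the database."""
    database.update(working_set)

db = {"parcel-1": "old boundary", "parcel-2": "unchanged"}
session = begin_edit_session(db, ["parcel-1"])
session["parcel-1"] = "new boundary"      # edits happen on the working copy
end_edit_session(db, session)
```

Until `end_edit_session` runs, the database still holds the old boundary; other records are untouched throughout.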

ModelServer TeamMate — is a server-based implementation of the MicroStation TeamMate product for document and workflow management of file-based information. ModelServer TeamMate is designed to provide uniform control over access, authorization, and revisions to files from both MicroStation-based and Internet browser-based clients. ModelServer TeamMate includes the same workflow coordination tools incorporated in the MicroStation TeamMate program. In fact, existing MicroStation TeamMate clients and projects will work with ModelServer TeamMate without change.

By implementing ModelServer TeamMate as an application-server program, access to all file-based project information can be easily and securely granted for all file types to all clients without requiring complicated client-level programming libraries.

The current MicroStation TeamMate document management package is file-oriented and is typically installed on every user system; ModelServer TeamMate moves that control to the server. Like ModelServer Publisher, ModelServer TeamMate transparently handles translations between MicroStation and AutoCAD formats, and it supports import and export of DGN, DWG, DXF, IGES, and STEP files. This product is expected to be available around mid-1997.

ModelServer Publisher is priced at US$9,950 per server for the single-channel version, which serves one client at a time and includes a copy of Netscape FastTrack Server. The multi-channel version, which serves any number of concurrent clients, is priced at US$24,500 per server, includes a copy of Netscape Enterprise Server, and is currently available. ModelServer Continuum running under SunOS is now in beta release, is priced at US$37,500 per server, and is expected to be available in Q3 1997. Plans also call for the product to integrate with RDBMS platforms beyond Oracle and for a version running under Windows NT, starting later this year.

Bentley Expands Geoengineering Line With New Tools

Last month Bentley announced three new geoengineering tools: MicroStation GeoOutlook, MicroStation GeoExchange, and MicroStation GeoCoordinator. These new products represent a continued expansion of the company’s geoengineering product line that increases the value of geoengineering data and boosts the productivity of geoengineering activities.

MicroStation GeoExchange — Is a translator that converts geoengineering data between MicroStation GeoGraphics and other industry standard formats, including ESRI and MapInfo. The comprehensive product processes both spatial and attribute information. Because the product is fully configurable, users can provide site-specific rules for more customized translation. Using MicroStation GeoExchange, MicroStation-based projects can employ geoengineering data from virtually any source and can create data for use by virtually any system.

MicroStation GeoOutlook — Is a low-cost, easy-to-use desktop and mobile tool for accessing and analyzing geoengineering data. It is intended for planners, managers, and field-based workers who participate in geoengineering projects but don’t need the data creation abilities or full power of MicroStation GeoGraphics. MicroStation GeoOutlook offers map data querying, basic spatial analysis, graphical presentation, feature management, and reporting. Users can simultaneously view vector and raster data. Also, the tool connects to a variety of desktop or enterprise databases, including Oracle, Informix, and Microsoft Access.

MicroStation GeoCoordinator — Is a projection management system designed exclusively for MicroStation and is based on technology from Mizar Systems Limited (Vancouver, BC). It assigns Coordinate Systems to individual maps, acknowledges coordinates from any assigned projection system, and converts MicroStation data from one Coordinate System to another. In addition, users can generate map grids based on any supported Coordinate System. The product includes a complete library of worldwide projection systems from Mentor Systems, Inc. (Thornton, CO).

All three new geoengineering tools are MDL-based. Each serves as both an end-user tool and a platform for other applications and customizations. MicroStation GeoExchange and MicroStation GeoCoordinator are both priced at US$1,495. MicroStation GeoOutlook is priced at US$995. Special pricing is available for Bentley SELECT subscribers. MicroStation GeoCoordinator is available now as a released product and the other tools are available as beta products.

The Open Engineering Connectivity Specification

To encourage full access to all project services available in the Engineering Back Office from any client program, Bentley will publish a set of APIs for each of the ModelServer products. These APIs, collectively referred to as the Open Engineering Connectivity (OEC) specification, allow users and third parties to customize, extend, and create new clients of ModelServers. The APIs are network-based and language-independent, allowing client development in various programming environments including Visual Basic, Java, C, C++, and ActiveX. For example, the OEC specification for ModelServer Publisher includes all necessary information for creating a client-side application to embed views of any MicroStation DGN or AutoCAD DWG file inside display forms on an HTML page or in a Visual Basic application. When combined with ModelServer TeamMate, these same applications can perform user authorization, check drawing status and location, and perform workflow validation.
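
A network-based, language-independent API implies that requests are encoded in a form any client language can produce and any server can decode. The sketch below illustrates that idea only; the message fields, function names, and the use of JSON as the neutral encoding are all our assumptions, not the published OEC specification.

```python
# Hypothetical sketch of a language-neutral view request to a ModelServer.
# JSON is used here purely as a convenient neutral encoding for illustration.

import json

def build_view_request(document, fmt="HTML"):
    """Encode an embed-view request as a language-neutral message."""
    return json.dumps({"service": "publisher",
                       "operation": "embed_view",
                       "document": document,
                       "format": fmt})

def parse_view_request(message):
    """A server written in any language can decode the same message."""
    return json.loads(message)

msg = build_view_request("site.dgn")
```

Because the wire format carries no language-specific structure, the same request could equally be built from Visual Basic, Java, or C, which is the point of a network-based API.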

NetSpace – Facility Network Solutions

Last year, Bentley formed a new affiliation with a company called NetSpace Systems Inc. (Huntsville, AL) that will offer technologies for design and management of facilities networks. The products emerging from NetSpace fully integrate network engineering data with existing corporate information technology (IT) architectures for the gas and electric utilities and telecom industries. NetSpace is a wholly-owned, but independent Strategic Affiliate of Bentley Systems. As a Bentley Strategic Affiliate, NetSpace is well positioned to address the worldwide utilities industry, where Bentley’s MicroStation is already widely used. Geoengineering is the enabling technology for NetSpace.

In March, at the AM/FM International Conference, NetSpace launched ESpace and GSpace, electric and gas automated mapping/facilities management/ geographic information systems (AM/FM/GIS) products for utilities companies. Both products provide out-of-the-box electric or gas utility facility management systems based on MicroStation GeoGraphics and MicroStation 95 running under Windows NT and 95.

ESpace and GSpace are database-driven applications. Symbology and facility/feature rules are defined in industry-standard relational databases, allowing utilities to easily and quickly configure company standards and rules for symbology, network connectivity, attribute definition, attribute constraints and defaults, value lists, and other functionality. This unique approach lets users generate graphical user interface (GUI) menus on the fly. Both products are delivered with standard models for electric and gas facility networks. The models are supported by fully functional tools for placement, editing, tracing, and analysis. Standard map and engineering work order generation are integral to the applications. ESpace also provides for interfacing to industry-standard network analysis packages such as ABB’s FeederAll.
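
The database-driven approach can be sketched with an in-memory relational table: the rules live in the database, and the GUI menu is generated from them on the fly rather than being pre-coded. The table name and contents here are invented examples, not the ESpace/GSpace schema.

```python
# Sketch of generating a GUI menu from a relational rules table on the fly.
# Table name and rows are hypothetical illustrations.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE feature_rules (feature TEXT, symbol TEXT, menu_label TEXT)")
conn.executemany("INSERT INTO feature_rules VALUES (?, ?, ?)", [
    ("transformer", "TX", "Place Transformer"),
    ("gas_main",    "GM", "Place Gas Main"),
])
conn.commit()

def build_menu(conn):
    """Generate GUI menu entries directly from the rules table, so a
    change to the table changes the menu with no code modification."""
    rows = conn.execute(
        "SELECT feature, menu_label FROM feature_rules ORDER BY feature")
    return [label for _feature, label in rows]

print(build_menu(conn))  # -> ['Place Gas Main', 'Place Transformer']
```

Adding a row for a new feature class would make it appear in the menu immediately, which is the setup-time advantage claimed over pre-defined GUIs.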

These NetSpace products represent a significant departure from the approach taken by many existing AM/FM/GIS systems, which traditionally have required pre-definition of the GUI and significant setup time to configure the rules basis for the application.

Both products include an intelligent land base model and associated capture and maintenance tools. The products provide a street centerline model suitable for dispatching applications, along with addressing and service locations that are fully integrated with the gas and electric facility models. By extending the power of MicroStation GeoGraphics with its product offerings, NetSpace offers sophisticated network facility model management and spatial GIS capabilities in a single package. Users can take advantage of the extensive capabilities of MicroStation GeoGraphics, including thematic mapping, spatial analysis, and reporting capabilities.

ESpace and GSpace can be further customized through the Open API provided by NetSpace or through customization and consulting services offered by NetSpace or its service providers. The Open API is the core of the system, providing for customization through Visual Basic, MDL, Visual C++, or C programming languages.

ESpace and GSpace employ MicroStation GeoGraphics’ map management capabilities for data management and are upwardly compatible with ModelServer Continuum for customers who want to take advantage of its ability to manage long transactions. ModelServer Continuum manages the engineering information transactions between a corporate RDBMS warehouse and users, providing seamless online facility management and network analysis and operations. Utilizing Bentley’s Engineering Back Office three-tier client/server architecture, ModelServer Continuum and NetSpace technologies establish collaboration between engineering and the IT infrastructure.

"Today, utilities companies demand better integration of existing engineering, Geographic Information Systems (GIS), and corporate information - with easy access," said Andrew Coe, founding president of NetSpace. "NetSpace’s mission is to bring these islands of engineering automation into the corporate mainstream, making engineering information widely available via corporate intranets and the Internet using common browsers."

Greg Bentley, president of Bentley Systems, said, "Of all the industries involved in geoengineering, the utilities sector — where MicroStation is used by over 25,000 users worldwide — arguably has the most critical need for solutions that break barriers between GIS, engineering, and corporate IT data stores."

Russell S. Kauffman, PLS, principal GIS engineer with the Public Service Electric & Gas Company, said, "Utilities are facing a confusing and uncertain future as they transition into a deregulated environment. As a result, the worlds of computer-aided design, GIS, AM/FM, and mapping must unite to address core competencies and processes."

Summing It All Up

Bentley’s geoengineering products accommodate an ever-greater range of applications, platforms, operating systems, development tools, and databases. Bentley’s products and services also appeal to a much larger community of both geoengineering users and developers, bringing them into the larger enterprise information technology fold.

It would be easy for Bentley to sit back and rest on its laurels since its products are truly productive and "open" when compared to those of many of its competitors. We feel that a major portion of the company’s focus today should be to more aggressively encourage the user community to implement its currently available technologies and products.

Bentley stands out as a vendor by perceiving that its customers face needs that cannot be met with the traditional software products being sold today by most other vendors. In particular, the current crop of Bentley geoengineering packages should be more heavily aimed at contributing both to an individual practitioner’s productivity and to a larger group’s collaborative productivity.

Until recently, concurrent and collaborative engineering techniques were largely confined to manufacturing environments, where processes were vastly improved by bringing together diverse teams to share ideas, information, and the resulting successes. As we approach the next millennium, we feel that geoengineering will enjoy comparable results, and that Bentley Systems will benefit from its significant contributions to the field.