ALL ABOUT DELPHI AND DELPHI PRISM (.NET), LAZARUS & PASCAL AND RELATED LANGUAGES
September 2010
BLAISE PASCAL MAGAZINE 11
CONTENTS
Articles
Mini Course SQL by Miguel van de Laar, page 10
Is Lazarus ready for creating commercial applications? by Zeljan, page 13
Delphi JSON Viewer by Paweł Głowacki, page 16
NexusDB: exceptionally good, a real surprise... by Erwin Mouthaan, page 22
The TMS DataModeler by Bruno Fierens, page 27
Introduction to Delphi Database Development: Part 1 by Cary Jensen, page 31
First Look at Advantage Database Server 10 by Cary Jensen, page 35
Real Time data collection by Anton Vogelaar, page 39
Object oriented databases by Detlef Overbeek, page 45
FastReport, what's up? by Marco Roessen, Rob van den Bogert and Detlef Overbeek, page 52
Using ADS with Delphi Prism and ASP.NET by Bob Swart, page 59
A datawarehouse example using kbmMW by Kim Madsen, page 66
Multiplatform Windows CE by Joost van der Sluis, page 73
Five Considerations for Choosing an Effective SQL Development Tool by Scott Walz, page 81
Editors
Rob van den Bogert, W. (Wim) van Ingen Schenau, Miguel van de Laar, M.J. (Marco) Roessen.
Correctors: Howard Page-Clark, James D. Duff
Translations: M.L.E.J.M. (Miguel) van de Laar, Kenneth Cox (Official Translator)
Copyright: See the notice at the bottom of this page.
Trademarks: All trademarks used are acknowledged as the property of their respective owners.
Caveat: Whilst we endeavour to ensure that what is published in the magazine is correct, we cannot accept responsibility for any errors or omissions. If you notice something which may be incorrect, please contact the Editor and we will publish a correction where relevant.
Columns
Editorial, page 4
Book reviews by Frans Doove, page 5
Advertisers
Advantage Databases, page 3
Barnsten, page 83
Cary Jensen Book Advantage Database Server, page 12
Components 4 Developers, page 84
Subscriptions (prices have changed): 1. Printed version: subscription € 60 (including code, programs and printed magazine, 4 issues per year including postage). 2. Non-printed subscription € 35 (including code, programs and download magazine). Subscriptions can be taken out online at www.blaisepascal.eu, by written order, or by sending an email to office@blaisepascal.eu. Subscriptions can start at any date. All issues published in the calendar year of the subscription will be sent as well.
Subscriptions run per calendar year. Subscriptions will not be prolonged without notice. Receipt of payment will be sent by email. Invoices will be sent with the March issue. Subscriptions can be paid by sending the payment to: ABN AMRO Bank, account no. 44 19 60 863, or by credit card: PayPal or TakeTwo. Foundation for Supporting the Pascal Programming Language (Stichting Ondersteuning Programmeertaal Pascal), IBAN: NL82 ABNA 0441960863, BIC: ABNANL2A, VAT no.: 81 42 54 147 (Stichting Programmeertaal Pascal). Subscription department: Edelstenenbaan 21, 3402 XA IJsselstein, The Netherlands. Tel.: +31 (0)30 68.76.981 / Mobile: +31 (0)6 21.23.62.68, office@blaisepascal.eu
All material published in Blaise Pascal is copyright SOPP Stichting Ondersteuning Programmeertaal Pascal unless otherwise noted, and may not be copied, distributed or republished without written permission. Authors agree that code associated with their articles will be made available to subscribers after publication by placing it on the website of the PGG for download, and that articles and code will be placed on distributable data storage media. Use of program listings by subscribers for research and study purposes is allowed, but not for commercial purposes. Commercial use of program listings and code is prohibited without the written permission of the author.
Graphical User Interface programs for Windows 32 and 64, Windows CE, Mac OS, Unix and Linux
Publisher: ProPascal Foundation - Netherlands (St. Propas)
ISBN: 978-94-90968-02-1
This is a preview of the contents of the book that will be published in English in December 2010. Length: about 800 pages. Price: 50 Euros.
The book contains 12 chapters and a large index of the main concepts. It is written by a group of specialist authors, each contributing one or more chapters according to his special expertise. Chapter 1, "The architecture of Lazarus", introduces basic concepts. Chapter 2 deals with the installation of Lazarus. Chapter 3 covers the integrated development environment (IDE). This long chapter (118 pages!) is a detailed and elaborate explanation of the potential of the many tools included in the IDE. The IDE is not discussed merely in overview; rather the chapter explains what the possibilities are and what you can arrange with them, covering each possibility in a somewhat repetitive style. After this elaborate discussion of the technical aspects, the software description starts. Chapter 4 handles Projects. Unfortunately too much reader knowledge is assumed when some topics are introduced: the chapter begins by mentioning "GUI Applications", but these are not explained until later. The explanation, when it comes, is exhaustive. Chapter 5, entitled "Target Platforms", documents the recompiling of Lazarus programs for Windows, Linux/Unix and Mac OS X.
DATABASE SPECIAL 2010 BLAISE PASCAL MAGAZINE
The book will be available in mid December 2010. You can order it directly from our web shop. If you pre-order the LAZARUS COMPLETE GUIDE you can also order a discounted Lazarus USB stick for only € 15.00.
Partial Preview of chapter 4: PROJECTS by Felipe Monteiro de Carvalho Lazarus is an Integrated Development Environment geared towards the development of Pascal applications. While its greatest strength is in the development of GUI applications, Lazarus can be used to develop all kinds of projects, including web applications and even programs without a user interface.
Downloading from the subversion repository
The installation of TortoiseSVN
Subversion program package
SVN repositories for Lazarus and FPC
Basic use of the Subversion command line
Checking out with TortoiseSVN
Installation on Windows
Installation on Linux
Installation on FreeBSD
Installation on MacOSX
Chapter 3 - The IDE
by Swen Heinig
The main Lazarus menu
Databases
Edit
Searching
View
Project
Compiler directives
Start
Package
Tools
Environment
Windows
Help
Object Inspector
Source editor
Source code completion
Message Composer
Debugger
Recompiling the IDE
Chapter 4 - Projects by Felipe Monteiro de Carvalho
GUI applications
Console applications
DLLs and Shared Objects
Dynamic Libraries
Libraries in MacOSX
Control panel applets for Windows
CGI Applications
CGI Programs in Pascal
CGI with Powtils
Unit Testing
Packages
Installing Components
Registering components
Property editors
Component editors
Services and Daemons
Image 4. : The dialog File -> New
The Lazarus menu selection File -> New displays a dialog with a list of possible Project Types which Lazarus can create. The section headed Projects shows how many different kinds of projects can be developed with Lazarus, and it should be noted that the list is not exhaustive. As a general software development tool, Lazarus can be used for any kind of programming project. The list in this dialog displays only the project types for which Lazarus has built-in templates, and external packages can extend this list. After selecting the menu item New in the File menu, a dialog appears showing a number of possible projects and modules to choose from. The contents of this dialog change from version to version, and also when a new package is installed which uses the so-called Tools API to install new entries in this dialog. Using this interface other IDE dialogs like Environment and Project Options can also be altered. Even without extra packages, the standard Lazarus dialog already contains a large number of possibilities to choose from, and those standard options are explained in Table 4.1 (which is split across two pages here).
Table 4.1 : The standard options for creating new modules and projects in Lazarus (Part 1 of 2)
Partial Preview of chapter 4: PROJECTS
To ensure that texts can be used in any language, using a GUI library with Unicode support is strongly recommended. The table below compares various GUI libraries in existence today which can be utilized to write Pascal applications. Note that only Unicode-enabled libraries are shown. Free Pascal applications can be written using all the libraries shown (except the Delphi VCL), but only with the Lazarus LCL and the KOL-CE library is it possible to use the Lazarus form designer, object inspector and standard components. Also note that additional components installed on the component palette can only be used with LCL applications. This book will obviously only cover working with the Lazarus Component Library (LCL).
Table 4.2 : GUI libraries and Pascal as of September 2010
To start developing a new GUI application based on the LCL, choose the menu File -> New and select the option Application in the dialog which appears. A new main form will be created with the name Form1, and it will be opened ready to be edited using the form designer.
Table 4.1 : The standard options for creating new modules and projects in Lazarus (Part 2 of 2)
In this chapter I will explain only the most important types of project which you can develop with Lazarus. For most of them a specific template can be selected in the menu File -> New, but for some a generic template is used.
Image 4.3 : Starting a new GUI application in the dialog File -> New
As soon as this option is selected, all the necessary files for this kind of project are created, and the form designer as well as the code editor windows will be opened. More windows can be added to the project by choosing the menu File -> New Form, and new Pascal source code units can be added with the menu File -> New Unit. The same action can also be performed by opening the dialog File -> New and selecting the corresponding module, or by clicking on the appropriate button in the Lazarus toolbar in the left corner of the main Lazarus window. While Lazarus offers project management and very advanced code completion for all kinds of projects, even console ones, it is for GUI applications that it really distinguishes itself from other IDEs. The code in the unit is automatically updated for each new component dropped on a form, as well as when you change a component's name, so that the code is automatically synchronized with the GUI editor. By double clicking an event the code editor is shown with the cursor placed in the procedure which handles it, and a suitable procedure is added to the code if none is assigned to the event. Even the end corresponding to a typed-in begin is automatically added, although excessive automatic coding from the IDE can get annoying, so it can be disabled via the Options menu.
Image 4.2 : The dialog Project - New Project offers the same project creation options as the dialog File -> New
4.1 GUI APPLICATIONS
Ever since its commercial introduction in the 1980s, the graphical user interface (GUI) has quickly grown in popularity, and today only technically savvy users work with console interfaces. The secret of writing a cross-platform GUI is simply using a cross-platform GUI library which ensures the portability of the displayed images and text.
Operating System specifics
The API of Windows 32/64
The special case Windows CE
Linux, FreeBSD and other Unix platforms
The APIs of MacOSX 32 Bit and 64 Bit
Configuration files
Resource files
Chapter 6 - The class libraries
by Michael Van Canneyt
TCP/IP programming
The Client program
The Server program
Web services
Programming Servers
Programming Clients
Message Logging
Object pooling
Service extensions
Chapter 12 - Database Access
by Michael Van Canneyt
File dialogs in Lazarus
Working with files
Searching in directories
Communicating with devices
The parallel port
Serial communication
The printer
Chapter 9 - Graphics programming
by Felipe Monteiro de Carvalho
The drawing canvas
Colours
TPen
TBrush
Fonts
Important graphical routines
Graphical components
Varieties of graphic objects
TGraphic
TRasterImage
Bitmaps
TJpegImage
Icons
Chapter 10 - Processes and Threads
by Felipe Monteiro de Carvalho
Processes
Threads
The RTL (FPC Run Time Library)
Loading and saving
Loading and saving using streaming
Component naming
The FCL (Free Component Library)
The LCL (Lazarus Component Library)
The Application class
Screen windows
Working with TForm
The properties of TForm
Special windows
The window environment
Controls in a window
Layout and Program design
Actions
Drag and Drop
The elements of the component palette
The Standard Tab
The Additional Tab
The Common Controls Tab
The Dialogs Tab
The Misc Tab
The Data Controls Tab
The Data Access Tab
The System Tab
The SynEdit Tab
Chapter 7 - Porting Delphi components by Michael Van Canneyt and Mattias Gärtner
The architecture of Lazarus components
Operating System independent layer
Operating System dependent layer
Component modelling
Porting in real life
From components to Lazarus Package
The component palette
Chapter 8 - Files and devices
Architectural overview
Database access
Choosing Databases
The Database Desktop: an additional tool
Classes for Database access
The Dataset
The DataModule
Data-aware controls
TDataset descendants
The Database Desktop
The Data Dictionary
Exporting Data
Generating code
Crash course SQL
Reports
Creating reports
The Report-Designer
Index
Index of graphics
Index of tables
ISBN 978 3 936546 54 5. Paperback, 398 pages, including CD. Price: € 50,00. The book is written in German. Morfik is an Australian software company, as you can see in the book title. You can find a huge amount of information about Morfik on the internet. Here is a brief summary: Morfik redefines web development by combining graphical design and visual programming in a single environment, dramatically reducing the time required to build modern web applications. It gives you the power of a full featured SQL relational database, but with a user-friendly visual interface. You can use the visual designer to make designing tables and queries for your internal relational database simple and easy. Morfik allows you to write your web-based application using any combination of the supported languages (C#, Basic or Pascal) for both the browser and server sides, using the Morfik Compiler's rigorous enforcement of referential integrity to produce scalable Ajax applications. The compiler implements automatic intelligent optimization as well as automatic obfuscation/compression of the final JavaScript. It is a complete application hosting and deployment platform, automating the process of deployment as part of the development environment itself. Deployment is handled via a deployment Server and a deployment Client which is built into the environment. Web solutions created with Morfik can be readily accessed by search engines. Application content can be easily published (with a clean URL), which gives your web application an advantage with search engines, bookmarking and devices that do not support JavaScript. End users can also browse through Morfik Ajax applications with JavaScript turned off and have a very similar browsing experience. Morfik automatically generates all the necessary browser code from your selected language. The generated code is industry standards compliant and therefore compatible with all major browsers.
Morfik extends the power of user-defined themes beyond Pages and Layouts to theme controls themselves. All Morfik controls support themes directly and give the user unprecedented ease in customizing the look and feel of Morfik applications and websites without interfering with the application logic. It provides a complete set of wizards to assist you with the creation of new project elements such as Tables, Queries, Forms, Reports, Web Methods and Modules. Wizards also help you with deploying your application, linking or importing external data sources, consuming Web Services, converting projects and more.
Code completion, as well as MorfikDoc popup help, is available when you are coding. You can navigate through your code using hyperlinks. The code editor has full information about your code at all times and can help you navigate to where a class, method, type or variable is declared. Morfik supports debugging browser and server side code from within Morfik by stepping over and through your high-level source code as well as the automatically generated JavaScript code. You can add breakpoints to pause execution anywhere in the browser or server side code. You can view the current value of a variable by hovering the mouse pointer over it, track the flow of execution in your code, and more. Morfik allows developers to both consume and publish Web Services through easy to use wizards. This offers an extremely easy path to the world of Web Services and Service Oriented Architecture (SOA). All server side components of Morfik Applications (XApps) are inherently SOA compliant servers. Morfik Packages and Widgets allow you to create fully functional solutions and advanced controls that can plug into any Morfik project. A rich set of pre-built Packages is also available from Morfik. You can create dynamic data-driven PDF reports that are naturally suitable for printing and distribution. The report design process is identical to designing forms and it utilizes all the properties that you have come to expect from word processing.
Conclusion: This is a very interesting web development tool, and well worth your while to try out. You can find the latest Morfik version included in our database_special.iso, available from the Blaise website. If you want the database_special.iso on DVD you will have to order it at the online Blaise shop.
I.CustomerId
ORDER BY C.LastName
The SQL statement above asks for the customer names and the amounts due on all invoices which are not yet paid, but whose due date has passed. This example contains a SELECT, FROM, WHERE, and ORDER BY clause. The SELECT clause selects three fields: LastName, FirstName and TotalAmount. TotalAmount gets an alias here: Amount. And so in the result of the SELECT statement the last field will be named Amount, not TotalAmount. This can be useful when you need the result directly in a component where the names of the fields are used as column names.
The ORDER BY-clause indicates how the result must be sorted. If the clause is not present, the order will be arbitrary. You can also designate several fields here, separated by commas. The default sort order is ascending, but by using the keyword DESC the sorting will be descending. E.g. ORDER BY Amount DESC places the highest TotalAmount at the top.
The FROM clause specifies that the table Customer can be accessed with the alias C and the table Invoice with the alias I. The tables receive an alias here as a shorthand notation you can use when the tables have to be distinguished. When the table Customer contains the field LastName and the table Invoice does not, you can use LastName in a SELECT clause unambiguously. But when both tables have a field with the same name, for instance CreationDate, the database doesn't know which field is meant. Then you must indicate whether you mean the Customer or the Invoice table, by using one of the aliases: C.CreationDate or I.CreationDate.
When there are several tables in the FROM-clause, this results in a so-called Cartesian product of those tables. If there is no WHERE-clause, the number of records in the result is equal to the product of the record counts of all tables (for instance, for 3 tables with 5 records each the result would contain 5 x 5 x 5 = 125 records). The records in the result then contain all possible combinations of the records in those tables.
As we are only interested in combinations of Customer and Invoice where the customer and invoice belong together, the condition C.Id = I.CustomerId is included. In principle, a WHERE-clause should contain such a coupling condition for each table after the first one. The WHERE-clause indicates which records must be included in the result. For this purpose you use boolean expressions to specify the conditions for a record's inclusion. In the case of the SELECT statement above, a record is included in the result when an invoice belonging to a certain customer is not yet paid whilst the due date has passed. The function Now() used here returns the current date and time.
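The query described above can be tried out without a Delphi project at all. The sketch below uses Python's built-in SQLite engine; the table layout and the field names Id, IsPaid and DueDate are assumptions made for this illustration (they are not given in full in the article), and SQLite has no Now() function, so its own date('now') is used instead - a good example of the vendor-to-vendor dialect differences mentioned earlier.

```python
import sqlite3

# Assumed schema and sample data for the Customer/Invoice example.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Customer (Id INTEGER PRIMARY KEY, LastName TEXT, FirstName TEXT);
CREATE TABLE Invoice  (Id INTEGER PRIMARY KEY, CustomerId INTEGER,
                       TotalAmount REAL, IsPaid INTEGER, DueDate TEXT);
INSERT INTO Customer VALUES (1, 'Halsema', 'Peter'), (2, 'Jansen', 'Anna');
INSERT INTO Invoice  VALUES (1, 1, 100.0, 0, '2009-01-01'),  -- unpaid, overdue
                            (2, 2, 250.0, 1, '2009-01-01'),  -- already paid
                            (3, 2,  75.0, 0, '2999-01-01');  -- not yet due
""")
# The SELECT with aliases, coupling condition and ORDER BY from the text;
# date('now') replaces Now(), which SQLite does not have.
rows = con.execute("""
    SELECT C.LastName, C.FirstName, I.TotalAmount AS Amount
    FROM Customer C, Invoice I
    WHERE C.Id = I.CustomerId AND I.IsPaid = 0 AND I.DueDate < date('now')
    ORDER BY C.LastName
""").fetchall()
print(rows)  # only the unpaid, overdue invoice remains
```

Only the first invoice satisfies all conditions, so the result is a single row ('Halsema', 'Peter', 100.0); the paid invoice and the one not yet due are filtered out by the WHERE-clause.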
SQL - Structured Query Language
SQL is a language which enables querying or updating relational databases such as IBM DB2, Firebird, Ingres, Microsoft Access, Microsoft SQL Server, MySQL, Oracle, PostgreSQL and SQLite. A relational database is a database system which stores a variety of relations between its data. These relations are for instance in the form of fields, records, tables, and links to other fields, tables, records and databases. Although SQL is formally a standard language, the implementations of SQL in the various commercially available databases can differ considerably. This is due to the fact that vendors have often developed their own extensions to standard SQL. To meet the ANSI standard (ANSI is the American National Standards Institute, an organisation which manages a number of American standards), however, at least the basic commands (as discussed in this article) must be implemented. Other commands (for example to construct databases and tables or to manage users) are entirely dependent upon the specific database or vendor. This article deals with the main basic SQL commands, which are: SELECT, INSERT, UPDATE, and DELETE.
These are the SQL commands most commonly used in program code. The other SQL commands are mostly used in database maintenance. Most databases offer GUI tools for such maintenance tasks, and these tools employ those other commands. For further information about these additional SQL commands you will need to consult your database manual or make use of the appropriate internet forums. Commands in SQL are called statements.
SELECT
The SQL statement which is most often used is probably SELECT. After all, this is the statement with which you question a database, and that's the reason why SQL is called a query language. The SELECT statement typically looks like this:
SELECT * FROM Customer
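The asterisk means "every field". As a quick sketch in Python with SQLite (the Customer table and its contents are made up for this illustration):

```python
import sqlite3

# A tiny assumed Customer table to demonstrate SELECT *.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Customer (Id INTEGER, FirstName TEXT, LastName TEXT)")
con.executemany("INSERT INTO Customer VALUES (?, ?, ?)",
                [(1, "Peter", "Halsema"), (2, "Anna", "Jansen")])

# SELECT * returns every field of every record.
all_rows = con.execute("SELECT * FROM Customer").fetchall()
print(all_rows)

# Naming fields instead of * returns only the listed columns.
names = con.execute("SELECT LastName FROM Customer").fetchall()
print(names)
```

The first query yields both complete records; the second yields only the LastName column, one tuple per record.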
INSERT
The INSERT statement adds a new record to a table, for example:

INSERT INTO Customer (FirstName, LastName)
VALUES ("Peter", "Halsema")

The INSERT INTO-clause contains the name of the table into which the record must be inserted and (between parentheses) the fields which will be filled with values. Fields which are not mentioned will get a default value (usually NULL, i.e. empty). Don't include auto-incremented fields (or Id fields) in the list of fields, since they will get new values automatically. The VALUES clause which follows INSERT INTO contains the values for the fields, in the same sequence as given in the INSERT INTO-clause. Depending upon the database you must enclose textual values either in quotation marks ( " ) or apostrophes ( ' ). Note that the number of fields must be equal to the number of values.

UPDATE
The UPDATE statement is meant to change values of records, as shown below:

UPDATE Customer
SET Address = "Steenweg", HouseNumber = 12, City = "Naarden"
WHERE Id = 12

The UPDATE statement above contains all the clauses which may be present in an UPDATE statement: the mandatory UPDATE- and SET-clauses and the optional WHERE-clause. As you will notice, the UPDATE statement has a slightly different layout than the INSERT statement. The UPDATE-clause contains only the name of the table in which one or more records must be changed. The SET-clause then lists, comma-separated, the fields which must be changed, each consisting of the name of the field, an equals sign ( = ) and the value which must replace the old value. The WHERE-clause functions the same as in the SELECT statement: its parameter is a boolean expression, and the records for which it is True will be changed. Take care: if no WHERE-clause is given, ALL records will be changed!

DELETE
Our survey of basic SQL commands is completed by DELETE. When designing a new database it is important to know the difference between logically and physically deleting records from a table. When you delete records only logically, they remain physically present in the database, but marked with a 'removed' status. Take care that such logically deleted records don't show up in overviews (simply add to all SELECT statements a condition to exclude removed or inactive records). Logical removal is particularly recommended in situations where inexperienced users could delete enormous amounts of data or where logging takes place. If you choose to physically delete a record, the database will no longer hold the deleted data anywhere. In SQL this is done by the DELETE statement:

DELETE FROM Customer
WHERE FirstName = "Peter" AND LastName = "Halsema"

The DELETE statement has only two clauses: the mandatory DELETE FROM-clause and an optional WHERE-clause. DELETE FROM is followed by the name of the table from which one or more records are to be deleted. The operation of the WHERE-clause is the same as for the SELECT and UPDATE statements. Take care: if no WHERE-clause is given, you'll end up with an empty table!

To conclude
You can do much more using the basic SQL commands than I have been able to cover here in an introductory article. If there is sufficient interest, I will examine more complex SQL statements in a subsequent article. If you have questions about SQL don't hesitate to email me (mvdlaar@gmail.com).
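As a hands-on footnote, the INSERT, UPDATE and DELETE statements from this mini course can be replayed against an in-memory SQLite database. The Customer layout is assumed for this sketch, and note that SQLite wants apostrophes rather than quotation marks for string values, illustrating the dialect differences discussed earlier.

```python
import sqlite3

# Assumed Customer layout with an auto-incremented Id field.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE Customer (
    Id INTEGER PRIMARY KEY AUTOINCREMENT,
    FirstName TEXT, LastName TEXT, Address TEXT, HouseNumber INTEGER, City TEXT)""")

# INSERT: the Id field is omitted, so it receives a new value automatically.
con.execute("INSERT INTO Customer (FirstName, LastName) VALUES ('Peter', 'Halsema')")

# UPDATE: only the record(s) matched by the WHERE-clause are changed.
con.execute("""UPDATE Customer
               SET Address = 'Steenweg', HouseNumber = 12, City = 'Naarden'
               WHERE Id = 1""")
after_update = con.execute("SELECT * FROM Customer").fetchall()
print(after_update)

# DELETE: without a WHERE-clause this would empty the whole table!
con.execute("""DELETE FROM Customer
               WHERE FirstName = 'Peter' AND LastName = 'Halsema'""")
remaining = con.execute("SELECT COUNT(*) FROM Customer").fetchone()[0]
print(remaining)
```

After the UPDATE the single record carries its new address fields; after the DELETE the table is empty again.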
I started my job in 2001 at a Croatian company called Holobit. At that time Holobit was a pretty small company with a few dozen customers. My primary task was to make the company's current C++ business applications run on Linux and Windows by using Delphi and Kylix and Borland's CLX technology. After years of C/C++ coding on Linux, OOP looked very simple and well organized. Two months later I concluded that Borland had great products, and that coding time was much shorter than it had been with C/C++ (gtk+, qt) using the vi editor. Anyway, within 3 months our business applications were converted to CLX and the company started selling for Linux & Win32. All was done with Kylix 2 and Delphi 6 (later upgraded to K3 & D7). Creating native Linux apps was a good decision, so the number of customers started growing rapidly. Our customers were happy with the ability to choose between Linux and Windows client apps for desktop PCs, because it saved some money and created a better and more secure environment. The second conversion issue that looked rather complicated - at that time - was databases. When I started to convert our applications, all of them used FoxPro, and I was very disappointed with it, because I had already used PostgreSQL on Linux. So your guess that we moved all of our applications to PostgreSQL is quite correct.
I had looked into Lazarus just a few times before, but I was not attracted previously, because it supported only the Gtk1 widgetset, which looked awful compared to the Qt2 used by Kylix. So now I got motivated to download the Lazarus trunk and find out how it worked with Qt (I had already tried Gtk before). Well, as I mentioned already, work on the Qt widgetset had only just started, and the result needed a lot of improvements, so it did not work. After a quick scan of the Lazarus principles, the Lazarus component library (LCL) and the widgetset connections to the LCL, I began to contribute to the Lazarus project with the primary goal of getting the Qt widgetset up and running. My first patches were sent to Felipe. He argued about my coding standards (hey, hey), but I fixed that and changed my coding standards to the Lazarus coding standard. Anyway, after a year or so the Qt widgetset became useable - in the meantime the Lazarus developers granted me svn write access, so there was no more need to wait for Felipe and others to commit my patches. At the same time business problems arose with Kylix & Delphi programming, and the company management thought about moving the complete codebase to Java or .NET. When the management decided this change must be made quickly, I objected.
Now our complete range of software is developed using FPC/Lazarus and uses the PostgreSQL RDBMS:
1. HoloERP - ERP system with > 400 modules (forms)
2. Cafeman - caffe bars & restaurants backoffice and POS system
3. TSuS - small shops backoffice & POS system
4. Cinema - software for cinemas (reservations, tickets etc.)
5. ArhStudio - architects' documentation database.
All of these applications use the following 3rd party components:
ZeosLib
FastReports (ported CLX)
TMS Grids (ported CLX, but we also licensed the newest VCL and ported it)
TMS Planner (ported CLX, later VCL)
FlexCell (licensed LCL - yes, there's an LCL version)
Our custom components
Conclusion: Lazarus is ready for commercial usage, especially for people with legacy Kylix 3 / Delphi 7 codebases. My personal opinion is that Lazarus Qt is much better than K3/D7 at this time (0.9.29 trunk), and developers will be happy with its new 0.9.30 version. Why? It is the only OOP RAD which supports so many platforms. It is constantly developed by volunteers, so it does not depend on commercial decisions and you can avoid bankruptcy etc. It costs almost nothing except energy and time. If it doesn't fit your needs, you can change it and contribute. If there's a bug, you can fix it and contribute the fix, or at least open an issue in the Lazarus Mantis issue tracker.
starter
Why JSON?
JSON is relatively new: it was first described by Douglas Crockford in July 2006 in his IETF Request for Comments "The application/json Media Type for JavaScript Object Notation" [2]. In many respects JSON is similar to XML, as both are text based data interchange formats widely used across the Web. While XML has now become a whole family of related standards - including XML Namespaces, XML Schema, XSL, XPath and others - JSON defines only a small set of formatting rules for the portable representation of structured data. The key strength of JSON is its simplicity. Douglas Crockford describes the JSON structure in his paper presented at the XML 2006 Conference in Boston, "JSON: The Fat-Free Alternative to XML" [3]: "The types represented in JSON are strings, numbers, booleans, object, arrays, and null." JSON syntax is nicely expressed in railroad diagrams.
Delphi and DBXJSON.pas
Many programming languages have built-in support for JSON or libraries to work with JSON. These JSON bindings for different programming languages are listed on the JSON homepage [4], including three open source Delphi libraries. Since Delphi 2010 the JSON support is part of the VCL library, as implemented in the DBXJSON.pas unit. In order to visualize the Delphi classes responsible for JSON support, I added the DBXJSON unit directly to a little test Delphi application, clicked on the Model Support tab in the Project Manager and got the following UML class diagram. Some of the classes from the DBXJSON.pas unit not related directly to JSON support are not shown here.
JSON has only three simple types - strings, numbers and Booleans - and two complex types - arrays and objects. A string is a sequence of zero or more characters wrapped in quotes with backslash escapement, the same notation used in most programming languages. A number can be represented as integer, real, or floating point. JSON does not support octal or hex. It does not have values for NaN or Infinity. Numbers are not quoted. A JSON object is an unordered collection of key/value pairs. The keys are strings and the values are any of the JSON types. A colon separates the keys from the values, and a comma separates the pairs. The whole thing is wrapped in curly braces. A JSON array is an ordered collection of values separated by commas and enclosed in square brackets. The character encoding of JSON text is always Unicode. UTF-8 is the only encoding that makes sense on the wire, but UTF-16 and UTF-32 are also permitted. JSON has no version number. No revisions to the JSON grammar are anticipated.
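These rules are easy to see in action outside Delphi as well. The sketch below uses Python's standard json module (the sample document is made up for this illustration) to show how each JSON type maps to an in-memory value and how text can be generated back from the object tree, which is conceptually what the Delphi parser and generator discussed later do.

```python
import json

# A made-up document containing all JSON types:
# object, string, number, boolean, array and null.
text = ('{"name": "Blaise", "year": 2010, "paid": false, '
        '"tags": ["sql", "json"], "extra": null}')

doc = json.loads(text)          # parse JSON text into an object tree
print(type(doc))                # a JSON object becomes a key/value mapping
print(doc["tags"])              # a JSON array keeps its order
print(doc["paid"], doc["extra"])  # boolean and null map to native values

round_trip = json.dumps(doc)    # generate JSON text from the tree again
print(round_trip)
```

Parsing and regenerating gives back an equivalent document, since the in-memory tree carries all the information in the JSON text.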
In Delphi 2007 the DBX database driver framework was re-engineered in pure Delphi code, and this introduced a number of interesting features including extensible command types. In the release that followed - Delphi 2009 - the DBX architecture was extended further, and the DataSnap framework for building client/server and multi-tier applications was re-engineered on top of the new DBX architecture.
DATABASE SPECIAL 2010 BLAISE PASCAL MAGAZINE
The DBXJSON unit also contains functionality to parse JSON text into a graph of TJSONValue descendants and to generate JSON text from the graph of objects in memory. The "TJSONAncestor.Owned" property (a boolean value) is expanded in the diagram to underline the fact that all JSON classes inherit the Owned property, which controls the lifetime of JSON objects in memory. The TJSONObject class contains a static method ParseJSONValue that effectively implements the JSON parser functionality. It accepts a string parameter with JSON text and returns a TJSONValue reference to the root of the graph of TJSONAncestor descendants. It is also possible to generate JSON text from the in-memory tree of JSON objects by calling the overloaded "ToString" method on any of the TJSONAncestor descendants.
TJSONDocument Component
Before Delphi 2010 introduced the DBXJSON unit, I was trying to implement JSON parsing functionality manually by coding the JSON railroad diagrams. With the DBXJSON implementation in place there is little point in reinventing the wheel; however there is still no design-time support for JSON. Everything has to be done in code. Hence the idea of creating a minimal VCL component wrapper for the JSON parser implementation provided by the TJSONObject.ParseJSONValue class method, which accepts JSON text and returns the object tree representing the corresponding JSON document structure in memory. The TJSONDocument component has been implemented inside a unit named "jsondoc" to mirror the "xmldoc" name of the unit containing the implementation of the TXMLDocument class. Below is the declaration of the TJSONDocument VCL component:
unit jsondoc;
// ...
type
  TJSONDocument = class(TComponent)
  private
    FRootValue: TJSONValue;
    FJsonText: string;
    FOnChange: TNotifyEvent;
    procedure SetJsonText(const Value: string);
    procedure SetRootValue(const Value: TJSONValue);
  protected
    procedure FreeRootValue;
    procedure DoOnChange; virtual;
  public
    class function IsSimpleJsonValue(v: TJSONValue): boolean; inline;
    class function UnQuote(s: string): string; inline;
    class function StripNonJson(s: string): string; inline;
    constructor Create(AOwner: TComponent); override;
    destructor Destroy; override;
    function ProcessJsonText: boolean;
    function IsActive: boolean;
    function EstimatedByteSize: integer;
    property RootValue: TJSONValue read FRootValue write SetRootValue;
  published
    property JsonText: string read FJsonText write SetJsonText;
    property OnChange: TNotifyEvent read FOnChange write FOnChange;
  end;
The full source code of this component and all other source code described in this paper can be downloaded from [1]; see the References section at the end of this article. The TJSONDocument class contains a published JsonText: string property that can be used to assign JSON text for parsing, and a public RootValue: TJSONValue property that can be used to assign a TJSONValue reference and generate JSON text. Assigning to either of these properties causes the other property to be updated, and the OnChange event is fired every time the JSON stored inside the component changes. In this way it is possible for other components of an application to be notified and refreshed. In this sense the TJSONDocument component can be used as a JSON parser and generator as described in the original JSON RFC [2]. The public IsActive: boolean property returns true if the TJSONDocument component contains valid JSON, or false if it is empty.
function TJSONDocument.IsActive: boolean;
begin
  Result := RootValue <> nil;
end;
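To sketch how the component is meant to be used (my own example, not from the article; the form, the memo control and the FDoc: TJSONDocument field are assumptions), wiring it up in code could look like this:

```pascal
procedure TForm1.FormCreate(Sender: TObject);
begin
  FDoc := TJSONDocument.Create(Self);  // owned and freed by the form
  FDoc.OnChange := DocChanged;
  // Assigning JsonText triggers parsing and fires OnChange
  FDoc.JsonText := '{"name":"Delphi","version":2010}';
end;

procedure TForm1.DocChanged(Sender: TObject);
begin
  if FDoc.IsActive then
    // Regenerate JSON text from the in-memory object tree
    Memo1.Lines.Text := FDoc.RootValue.ToString
  else
    Memo1.Lines.Text := '(no valid JSON)';
end;
```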
The TJSONObject.ParseJSONValue: TJSONValue method is sensitive to the contents of the JSON text passed for parsing. If the string provided does not contain valid JSON text, or it contains JSON text with additional whitespace characters, then it returns a nil TJSONValue reference. The class function StripNonJson is used to remove any non-JSON characters from the JSON text, and is implemented as follows using the TCharacter class from the VCL Character unit.
class function TJSONDocument.StripNonJson(s: string): string;
var
  ch: char;
  inString: boolean;
begin
  Result := '';
  inString := false;
  for ch in s do
  begin
    if ch = '"' then
      inString := not inString;
    if TCharacter.IsWhiteSpace(ch) and not inString then
      continue;
    Result := Result + ch;
  end;
end;
The process of JSON parsing is implemented in the ProcessJsonText method, which is called as a side-effect of assigning to the published JsonText: string property.
procedure TJSONDocument.SetJsonText(const Value: string);
begin
  if FJsonText <> Value then
  begin
    FreeRootValue;
    FJsonText := Value;
    if FJsonText <> '' then
      ProcessJsonText;
  end;
end;

function TJSONDocument.ProcessJsonText: boolean;
var
  s: string;
begin
  FreeRootValue;
  s := StripNonJson(JsonText);
  FRootValue := TJSONObject.ParseJSONValue(BytesOf(s), 0);
  Result := IsActive;
  DoOnChange;
end;
The TJSONDocument was designed to be as minimal as possible. For convenience it also surfaces the EstimatedByteSize: integer method provided by the underlying DBXJSON implementation. This is how the TJSONDocument component looks at design-time inside the Delphi 2010 Object Inspector.
Here we go
I have decided to create my JSON tree view component as a descendant of the Delphi VCL TTreeView component. A good Delphi programming practice would be to derive it from TCustomTreeView instead, to be able to decide which inherited protected members of the class should be declared as published. In my case I want the end user to have access to the whole TTreeView component functionality at design-time, so I do not need to hide any inherited properties.
unit jsontreeview;

type
  TJSONTreeView = class(TTreeView)
  public
    procedure LoadJson;
  published
    property JSONDocument: TJSONDocument // ...
    property VisibleChildrenCounts: Boolean // ...
    property VisibleByteSizes: Boolean // ...
  end;
TJSONParser Component
In a sense the TJSONDocument component can be considered the implementation of a Document Object Model for JSON. But what about SAX for JSON? SAX, or Simple API for XML, presents a completely different approach to document parsing. Instead of building an in-memory representation of the document, it just goes through it and fires events for every syntactical element encountered [7]. It is up to the application to process the events it is interested in - for example, to find something inside a large document. Based on the TJSONDocument I have implemented an experimental TJSONParser component that implements a SAX processing model for JSON. A bullet-proof SAX parser for JSON should be implemented from scratch to directly parse JSON text and fire the relevant events; in my case it sits on top of the in-memory representation of JSON. The jsonparser unit contains the following enumerated type that lists the different token types that can be found in a JSON text:
type
  TJSONTokenKind = (jsNumber, jsString, jsTrue, jsFalse, jsNull,
    jsObjectStart, jsObjectEnd, jsArrayStart, jsArrayEnd,
    jsPairStart, jsPairEnd);
There is also a declaration of a TJSONTokenEvent that is fired when a JSON token is encountered:
type
  TJSONTokenEvent = procedure(ATokenKind: TJSONTokenKind;
    AContent: string) of object;
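As a quick illustration (my own sketch, not from the article; the form and memo names are assumptions), a handler attached to the parser's OnToken event can filter for just the token kinds it cares about:

```pascal
procedure TForm1.ParserToken(ATokenKind: TJSONTokenKind; AContent: string);
begin
  // SAX-style processing: react only to interesting tokens, ignore the rest
  case ATokenKind of
    jsPairStart: Memo1.Lines.Add('key:    ' + AContent);
    jsString:    Memo1.Lines.Add('string: ' + AContent);
    jsNumber:    Memo1.Lines.Add('number: ' + AContent);
  end;
end;
```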
JSON Standalone Viewer
In the next step I have used the TJSONDocument and TJSONTreeView components to implement a simple JSON Viewer application. The functionality is minimal. You can clear the current contents of the JSON viewer using the Clear button, or you can copy JSON text to the clipboard and paste it into the viewer window using the Paste button. There is also a popup menu to control whether children counts and node byte sizes are displayed or not. The application icon was created using IcoFX (http://icofx.ro/) directly from the JSON logo downloaded from the JSON home page. Below is a screenshot from the Delphi JSON Viewer at runtime. Just copy some JSON text to the clipboard and paste it into the viewer.
The TJSONParser component can do two things. If you call the FireTokenEvents public method, it will traverse the underlying JSON document and fire OnToken events for every token it encounters. It is also possible to build a token list in memory that can be accessed via the TokenList property. This could be useful if we wanted to implement a JSON viewer using the Virtual Tree View component, which requires fast access to the underlying data structure. What was interesting during the implementation of these two methods was the fact that the underlying document traversal algorithm was the same for both firing the events and building the list. In order to avoid code duplication, I decided to parameterize the traversal algorithm using anonymous methods. The following anonymous method signature is defined in the jsonparser unit:
type
  TJSONTokenProc = reference to procedure(ATokenKind: TJSONTokenKind;
    AContent: string);
The signature of this method matches both the DoOnAddToTokenListEvent and the DoOnFireTokenEvent private methods in the declaration of the TJSONParser class. The actual document traversal algorithm has been implemented inside a DoProcess method that is called from both the FireTokenEvents and the BuildTokenList methods in the following way:
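A minimal sketch of how FireTokenEvents and BuildTokenList might hand their respective private handlers to DoProcess (my reconstruction; the JSONDocument property and FTokenList field are assumptions, while the method names come from the text above):

```pascal
procedure TJSONParser.FireTokenEvents;
begin
  if JSONDocument <> nil then
    // Traverse the document, firing the OnToken event for every token
    DoProcess(JSONDocument.RootValue, DoOnFireTokenEvent);
end;

procedure TJSONParser.BuildTokenList;
begin
  FTokenList.Clear;
  if JSONDocument <> nil then
    // Same traversal, but each token is appended to the in-memory list
    DoProcess(JSONDocument.RootValue, DoOnAddToTokenListEvent);
end;
```

Delphi 2010 accepts ordinary methods wherever a matching anonymous method reference (TJSONTokenProc) is expected, which is what makes this parameterization so compact.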
In this way we have both functionalities implemented without code duplication inside the recursive DoProcess method:
procedure TJSONParser.DoProcess(val: TJSONValue; aTokenProc: TJSONTokenProc);
var
  i: integer;
begin
  if val is TJSONNumber then
    aTokenProc(jsNumber, TJSONNumber(val).Value)
  else if val is TJSONString then
    aTokenProc(jsString, TJSONString(val).Value)
  else if val is TJSONTrue then
    aTokenProc(jsTrue, 'true')
  else if val is TJSONFalse then
    aTokenProc(jsFalse, 'false')
  else if val is TJSONNull then
    aTokenProc(jsNull, 'null')
  else if val is TJSONArray then
  begin
    aTokenProc(jsArrayStart, '');
    with val as TJSONArray do
      for i := 0 to Size - 1 do
        DoProcess(Get(i), aTokenProc);
    aTokenProc(jsArrayEnd, '');
  end
  else if val is TJSONObject then
  begin
    aTokenProc(jsObjectStart, '');
    with val as TJSONObject do
      for i := 0 to Size - 1 do
      begin
        aTokenProc(jsPairStart, Get(i).JsonString.ToString);
        DoProcess(Get(i).JsonValue, aTokenProc);
        aTokenProc(jsPairEnd, '');
      end;
    aTokenProc(jsObjectEnd, '');
  end
  else
    raise EUnknownJsonValueDescendant.Create;
end;
Summary
JSON is currently probably the most important data interchange format in use. Its simplicity makes it easy to process, and information encoded with JSON is typically smaller than the same information in XML. Over the years XML has become a whole family of specifications, and it is not a trivial task to implement a fully compliant XML parser from scratch. Delphi 6 was the first commercial IDE on the market to introduce support for XML SOAP web services. Delphi 6 also introduced the TXMLDocument component and the XML Data Binding Wizard to make it easier to work with XML. JSON so far lacks something equivalent to an XML Schema (which abstracts a cross-platform representation of XML metadata); however, a JSON equivalent is slowly emerging. On the JSON home page you can find a reference to a draft version of an IETF RFC, "A JSON Media Type for Describing the Structure and Meaning of JSON Documents" [8]. This is still pending feedback, but in the future it could be a starting point for implementing a Data Binding Wizard for JSON. In this article I have described a JSON Viewer application implemented with Embarcadero Delphi 2010. The source code that accompanies this paper is organized in the form of two packages for the Delphi components (one runtime and one design-time) and the djsonview Delphi VCL Forms application that implements the Delphi JSON Viewer.
References
1. Source code for this article
http://cc.embarcadero.com/item/27788
2. JSON RFC
http://www.ietf.org/rfc/rfc4627.txt
5. JSON Examples
http://www.json.org/example.html
About the author: Paweł Głowacki is Embarcadero Technologies' European Technical Lead for Delphi, RAD Studio and All-Access technologies. Previously, Paweł spent over 7 years working as a senior consultant and trainer for Delphi within Borland Education Services and CodeGear. As well as working with Embarcadero customers across the region, he also represents Embarcadero internationally as a conference and seminar speaker. For more information check out Paweł's technical blog at http://blogs.embarcadero.com/pawelglowacki
The TJSONParser component can be used as a starting point for arbitrary JSON processing at the lowest level of actual JSON text tokens. Delphi anonymous methods are really cool!
After installation a number of shortcuts appear on the desktop. One shortcut is for the so-called Enterprise Manager. This program supervises the different databases. The two other shortcuts are for help information. As shown below, I was walking on clouds....
Figure 4: The Enterprise Manager
At this point my hard disk has a folder C:\data\Northwind with a number of files. For each table there is a so-called .nx1 file. In the next release, NexusDB v4, due later this year, the database will however be housed in a single file.
Figure 6: Datamodule sample embedded The component TnxDatabase has a property AliasPath which must specify the directory of the database. In this case it is:
C:\data\Northwind
Figure 8: Object Inspector of the TnxTable
On the project's MainForm we notice the usual components such as DBGrid and DataSource. The DataSet property of the DataSource refers to the nxTable1 component on the datamodule. Next, if we set the ActiveDesignTime property of nxTable1 to True, the Customers table data appears in the grid, just as we are used to.
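The same wiring can also be done in code at run time. The following is a sketch of mine: AliasPath, TableName and the component names come from this article, while the Active/Open calls and the property linking the table to the database are my assumptions about the NexusDB API:

```pascal
// Minimal runtime setup of the embedded NexusDB components (sketch).
nxDatabase1.AliasPath := 'C:\data\Northwind';  // folder holding the .nx1 table files
nxDatabase1.Active := True;

nxTable1.Database := nxDatabase1;   // assumed property linking table to database
nxTable1.TableName := 'Customers';
nxTable1.Open;

DataSource1.DataSet := nxTable1;    // the DBGrid now shows the Customers data
```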
Figure 7: Object Inspector of the TnxDatabase
The TnxTable component in the datamodule refers to the Customers table; the TableName property should reflect that.
Figure 9: MainForm sample embedded
On the end-user's computer this application, as stated before, needs no separate database server. The database engine is linked directly into the exe file of the application. Not a single DLL is required. The end-user only needs the executable of the application itself besides his database files, in this case C:\data\Northwind.
Lazarus and Free Pascal
The Nexus team is working hard on a NexusDB version for Lazarus and Free Pascal. Below, notice a preliminary screenshot of the designtime support in Lazarus. Later this year NexusDB v4 will be released. It is the intention to officially support Lazarus and Free Pascal shortly after that time.
Figure 10: Lazarus designtime support
Conclusion
NexusDB is a complete database with all possible options. It is fast, installs without pain and the user manual contains many examples. The data-access components included simplify the use of the database. A program to manage the databases is also included; this Enterprise Manager is particularly user friendly. In this article I have mainly focused on the Free Embedded Version of NexusDB. Using the clearly written user manual, it is fairly simple to design an embedded database application in Delphi. Besides that, NexusDB offers many advanced options such as in-memory tables, dynamic SQL and even a remoting framework. At this time I do not yet master all the options; however, the website discusses them all, together with many examples. I recommend everybody looking for a database to have a good look at NexusDB. Take a look at the advertisement from Nexus on page 2: there is an offer for a 20% reduction!
About the author: Erwin Mouthaan was born in Utrecht, the Netherlands, in 1966. He studied applied mathematics at the University of Twente where he learned programming and the Pascal programming language. He worked for various research institutes, is currently self employed and very happy to use Delphi a lot.
Nexus Database Systems - Support Policy changes
Our current support scheme (introduced shortly after the release of NexusDB V2.07 back in April 2008) is based on yearly product-based support, which gives access to newly released minor upgrades of the product and provides general support according to the level (bronze, silver, gold or platinum). This scheme specifically excludes updates to major new releases, for which an extra upgrade charge is applicable. With a major and exciting new RAD Studio version just released and NexusDB V4 being worked on, we feel that this licensing scheme is bringing a lot of customers into a difficult situation: should they update their support now and have to pay for a V4 upgrade again? Or should they wait for V4 and stick with what they have already? Since having happy customers is our main goal, we decided to do something about it and change the licensing to include updates to major versions. This means that with active product support customers will have access to any new release of the product, no matter whether it's a minor or major upgrade.
What does that mean for your product support? How much does it cost?
There is NO change to the current prices for renewals due to this change of policy. This means that compared to the current scheme you will get significantly more for your money. Some products currently have no specific upgrade pricing, which means that the full license price is applicable for major upgrades (e.g. NexusDB Embedded SRC). To keep the pricing consistent we will provide discounted upgrade prices for these products and will add them to the price list once they come into effect after October 15th 2010. Please refer to
http://www.nexusdb.com/support/index.php?q=pricing
for an up-to-date full list of all prices.
Are there other benefits than upgrades to new major versions?
Yes, there is at least one more major benefit: you will get new functionality earlier. The scheme of paid major upgrades dictated for us (Nexus Database Systems) which new functionality was released into minor upgrades and which we held back for major upgrades. This led to situations where functionality was held back until the next major upgrade even though it was essentially ready for release. Since new features are no longer a driving force for a major upgrade, we can now roll out new features as they become ready. This has the immediate benefit for you, our customer, that you don't have to wait for feature X until features Y and Z are finished. Another effect is that we can concentrate on working on the current version only, instead of having to work on a new version and provide support for one or two older major releases. This has the immediately obvious benefit that all effort and energy is channeled into making the product better, faster. Another, not so immediately visible benefit is the lower administrative effort necessary, from branches in version control, through mailings, to pre- and after-sales support. It all gets much easier to manage. In short, we might lose some money generated by paid-for major upgrades, but we are sure that this change will allow us to make our products better faster and to serve our customers in the best possible way, which in turn will sell us more licenses.
Your support will still be product-based. You will get access to all product updates released (minor AND major) within the time your product support is active. You are eligible for product support according to your level (bronze, silver, gold or platinum) within the time your product support is active. After your support expires you will get a grace period of 1 month to renew your product support at a renewal price; the 1-year period starts with the date of expiry. If you let the grace period lapse you can renew your product support at the normal upgrade price; the 1-year period starts with the date of purchase. We will send you 3 reminder messages: 1 month before expiry, at expiry, and 2 weeks after expiry.
What happens if your product support is expired?
You will still have access to all binaries and installers released before your support expired. You can still create and distribute your own products with the versions that you have access to (we do not revoke usage and distribution rights). You will NOT get access to product updates released after your support expired. You can renew your product support for 1 year after the 1-month grace period for the normal upgrade price; the 1-year period starts with the date of purchase. If your product support is expired right now, we give you an extra extended grace period until October 15th 2010 to renew your support at the renewal price. Thereafter the 1-month grace period applies.

Product                  Renewal within or        Upgrade / Renewal
                         before grace period      after grace period
NexusDB Developer SRC    AUD 500                  AUD 650
NexusDB Developer DCU    AUD 300                  AUD 400
NexusDB Embedded SRC     AUD 200                  AUD 300
NexusDB ADO Provider     AUD 300                  AUD 350
NexusDB ODBC Driver      AUD 300                  AUD 350
NexusDB PHP Connector    AUD 300                  AUD 350
Nexus Portal Pro         AUD 790                  AUD 950
Nexus Portal Std         AUD 520                  AUD 625
by Bruno Fierens
Introduction
Data modeling is a mandatory requirement throughout the lifetime of a database system: from the initial design of the software, when the first structure is created and modeled, to later system updates, when the database structure is modified. There are also situations where manipulation of the database structure may be a complicated task, such as when there is a need to take over a productive system and work on a non-documented database, or to convert a database from one DBMS to another. There are several tools related to data modeling on the market: some DBMS-specific, others generic; some useful for specific and isolated tasks, others offering a multitude of features (and usually not very cheap). TMS Data Modeler is a tool that provides nothing but the essential features for creating and maintaining a database: it integrates database design, modeling, creation and maintenance into a single environment, with a simple and intuitive user interface to manipulate databases efficiently. This article briefly describes the main features of TMS Data Modeler, demonstrating how it may be used to create a project and maintain an existing database.
Data Modeler main features
TMS Data Modeler is a generic tool that allows data modeling independently of the DBMS used, in an easy-to-use interface. The application allows you to start modeling a database from scratch, as well as import the structure of an existing database (reverse engineering). It allows you to generate scripts to create the full database, or to upgrade an existing database with an update script, through its version control system. In addition, it provides features for conversion from one DBMS to another, consistency checking and visualization of entity-relationship diagrams, among others. Data Modeler supports several database management systems, currently: Absolute Database, Firebird 2, MS SQL Server 2000/2005/2008, MySQL 5.1, NexusDB V3 and Oracle 10g.
Creating a project in TMS Data Modeler
There are two ways to start a project in Data Modeler: creating a new project from scratch or importing the data dictionary from an existing database.
Figure 1: Starting
Choosing the "New Project" option, just select the target database and an empty project will be created. By default Data Modeler provides a diagram named "Main Diagram". Tables and the relationships between them can be created visually through the diagram. All objects in the database (apart from tables we can have procedures, views, etc.) can be accessed, created and edited through the Project Explorer, located on the left of the screen. With the "Import from Database" option you need to configure the connection to the database whose structure will be imported. After importing the structure, Data Modeler will hold all the database objects: tables, relationships, triggers, procedures, views, etc. All objects are listed in the Project Explorer on the left, in their respective category. For an overview of the imported structure, it is possible to open the "Main Diagram" and select "Add all tables" from the context menu.
Figure 3: Versioning
Figure 5: Editing relationship
Version upgrade
After the changes are made in the project, version 2 becomes different from version 1, which was archived right after importing the structure from the database. Using Data Modeler's version comparison tool, we can view the structure of each version side by side, with their differences highlighted (created, removed or changed objects), and the creation script of selected objects. On the same screen we can select the changes from which to generate a script to update the database.
Figure 6: Clicking on "Generate", we have our script ready to update the database from version 1 to version 2, containing all alterations.
Figure 7: Auto increment Reading the structure of a Data Modeler project from your application. TMS Software offers a free library, Data Modeler Library (DMLib), which allows access to the structure of a database stored in a TMS Data Modeler project, from any application in Delphi or C++Builder. It is a collection of read-only classes containing clear methods and properties for obtaining information about all objects from the data dictionary. Here is a small example of how to use DMLib for getting the fields and their data types from a specific table in the data dictionary:
program DMread;

uses
  SysUtils, uAppMetaData, uGDAO;

var
  amd: TAppMetaData;
  table: TGDAOTable;
  field: TGDAOField;
  i: integer;
begin
  amd := TAppMetaData.LoadFromFile('C:\tmssoftware\dmlib\jedivcs.dgp');
  try
    table := amd.DataDictionary.TableByName('attachments');
    if table <> nil then
    begin
      for i := 0 to table.Fields.Count - 1 do
      begin
        field := table.Fields[i];
        Writeln(Format('Field %d: %s [%s]',
          [i + 1, field.FieldName, field.DataType.Name]));
      end;
    end
    else
      Writeln('Table not found.');
  finally
    amd.Free;
  end;
end.

Output:

Field 1: idattachment [Int (identity)]
Field 2: description [VarChar]
Field 3: filename [VarChar]
Field 4: createdon [Datetime]
Field 5: content [Image]
Field 6: PROJECTID [Int]

About the author
Bruno Fierens started doing several small projects in the mid-eighties in GWBasic and soon after discovered Turbo Pascal v3.0 and got hooked on its fast compilation, clean language and procedural coding techniques. Bruno followed the Turbo Pascal releases and learned object oriented programming when it was added to the Pascal language by Borland. With Turbo Pascal for Windows and Resource Workshop, he could take his first steps in Windows programming for several products for the local market. TMS software became a Borland Technology Partner in 1998 and the team grew to 4 persons in the main office in Belgium, with developers in Brazil, Uruguay, India and Pakistan doing specific component development. TMS software is now overseeing a huge portfolio of Delphi components and looks forward to strengthening this product offering in the future. With Delphi 2010, Embarcadero now offers a very rich and powerful environment for creating fast and solid Windows applications using the latest technologies in Windows 7 such as touch. Bruno said he will watch the announced cross-platform development tools from Embarcadero closely, and TMS software is hopeful this will bring exciting new opportunities for Embarcadero, Delphi and our components. We live indeed again in very interesting times for passionate Delphi developers.
Special offer for Blaise Pascal Magazine subscribers: 20% discount for TMS Data Modeler: € 76,00 (standard price € 95,00). Offer valid until the end of November 2010. Coupon code: DM-BLAISE, to be used on the online order form: https://secure.element5.com/register.html?prognr=300398512&languageid=1 (just click on the url)
by Cary Jensen
This is the first article in an extended series of articles taking a look at database application development in Delphi. In this installment, Delphi database expert Cary Jensen takes a look at databases in general, and provides a general introduction to Delphi's support for database applications.
Here's a trivia question: what was the name of Delphi during its original beta test (prior to its February 1995 release)? The answer is Delphi (but you might have already known that). But why did the development team pick such an odd name for their groundbreaking, component-based advancement of their Pascal compiler and IDE (integrated development environment)? The answer is related to databases. Delphi is the name of both the city and the temple in Greece where people would travel to speak to the Oracle (a reference to the ORACLE Database Server, if that wasn't obvious). And although Delphi can work with just about any database you can think of, the point is that it was designed from the start to be a great environment for developing database applications. While database development has not always been considered the most glamorous area of software development (that's other people talking, not me; I think database development is quite glamorous, frankly), a strong argument can be made that it is the most important, with respect to how it affects our daily lives. It is nearly impossible to go a day without having some interaction that involves a database. Whether you are making a purchase at the market, checking into a hotel, catching an airline flight, or withdrawing money from an automatic teller machine, data needs to be collected, and in most cases, used to ensure that your experience is positive or to make it better in the future. And that's where our job comes in. As database developers, we are responsible for understanding where the data comes from, where it goes, and how it needs to be used.
And that understanding helps us to create software that helps people be more productive and makes their lives better.
From the Beginning
To begin with, I'm a real believer in foundations. I feel that the more you know about the fundamentals of the tools that you use, the better you'll be able to use them. Unfortunately, when we're talking about a mature product like Delphi, there are fewer resources available now than there were in the early years of Delphi. And this poses a problem, in part because there is new growth in the Delphi market. Not only is there a need for new Delphi developers to help support existing projects, but Delphi has emerged as one of the leading development tools for native Windows development, and that particular area of development is not going away. And that's where this series comes in. Though I wear many hats, in my heart I am a database developer; I have been deploying multi-user database applications since the 1980s, and doing so in Delphi since it shipped.
What is a Database?
A database is a mechanism for storing and retrieving data. That's all, really. In the simplest of worlds, a text document can be a database. XML is a text format, and many do use it as a database. A database application, on the other hand, is much more. Most database applications assist in the collection of data, the manipulation of that data, and turn that data into information (reports, charts, actions, and so forth). These applications, however, do require a database, but I'm starting to get ahead of myself here. In most cases, the data of a database is structured, which is to say that it is organized. In this regard, text documents often fall short. As a result, database developers often rely on something else. For the purpose of brevity, I am going to oversimplify this and say that most Delphi developers rely on three types of databases: custom file structures, local file system databases, and remote database servers. Yes, there are others, but in some respects they are variations (or even combinations) of one or more of these.
Custom File Structures
A database based on a custom file structure can make use of either simple text or binary files. In most cases, these files are highly organized. For example, each individual piece of data may be separated from other pieces of data by a particular character, or separator. An example of such a file is a comma separated values (CSV) file, which is a common text format. It's not necessary for data to be separated by characters. Specifically, if you know that each piece of data takes up four bytes in the file, you can retrieve the individual data values by parsing the file, snipping off four bytes at a time. You would also store this data in the file in the same way, writing each value as a four-byte chunk. With Delphi, some developers create files that contain a series of record structures, where by record structures I literally mean Delphi record types.
These files are called typed files, since they are files of a Delphi type: records, in this instance. (For a nice introduction to using typed files, see Zarko Gajic's article at http://delphi.about.com/od/fileio/a/fileof_delphi.htm.) And these are just a few of the options for working with custom file structures. One of the advantages of custom file structures is that you can usually read and write them very fast. In addition, your Delphi applications that use these files normally rely on nothing more than Delphi's internal file IO (input/output) capabilities. By contrast, most of the other database approaches rely on external files, such as client DLLs. Figure 1 shows a simple diagram that depicts the interaction between a Delphi application and custom data structures.
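As a sketch of the typed-file approach just described, the short Free Pascal style program below writes two fixed-size records and then seeks straight to the second one. The TPerson record, its fields, and the file name are all invented for illustration.

```pascal
program TypedFileDemo;
{$mode objfpc}{$H+}
uses
  SysUtils;

type
  TPerson = packed record
    Name : String[30];   { short string: a fixed 31 bytes on disk }
    Year : Integer;
  end;

var
  F : File of TPerson;   { a "typed file": a file of TPerson records }
  P : TPerson;
begin
  { write two records }
  AssignFile(F, 'people.dat');
  Rewrite(F);
  P.Name := 'Ada';    P.Year := 1815; Write(F, P);
  P.Name := 'Blaise'; P.Year := 1623; Write(F, P);
  CloseFile(F);

  { seek directly to the second record: every record has the same size,
    so random access is trivial and fast }
  AssignFile(F, 'people.dat');
  Reset(F);
  Seek(F, 1);
  Read(F, P);
  CloseFile(F);

  WriteLn(P.Name, ' ', P.Year);   { Blaise 1623 }
end.
```

Because each record occupies exactly SizeOf(TPerson) bytes, the n-th record lives at a fixed offset, which is what makes this kind of custom file structure so quick to read and write.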
There is a downside, however. To begin with, these types of database are almost always single-user, meaning that only one application or person can work with the data files at a time. Sure, you could devise a mechanism by which two or more applications (or people) can share this data at the same time, but that would be extremely complicated, and is almost never worth the effort (since good, multiuser solutions already exist). Second, these types of databases are typically proprietary, which means that your applications, and only your applications, can read the data. While you may find this desirable, it is nonetheless a limitation.

Therefore, I am going to take this opportunity to start from the beginning. Before the beginning, actually, if you think about it. I am going to start with a brief overview of databases. Towards the end of this article I will discuss the basics of Delphi's database-related classes. In future articles in this series I will go into detail about Delphi's many database-related tools, including ClientDataSets, multi-tier development with DataSnap, cloud computing, and more. But for now, we'll begin at the beginning.
DATABASE SPECIAL 2010 BLAISE PASCAL MAGAZINE
Figure 2: A file server database

While file server databases offer a variety of benefits over custom file structures, they have their limits, especially when compared with feature-rich remote database servers. I'll focus on just two limits, related to network bandwidth and database stability. When two or more clients need to share data from a file server database, that data must be placed in a network location accessible to all clients. When those clients need to read the data, the data must be transferred across the network. When many clients are reading and writing data, this can mean that a large amount of data is moving around on the network. Actually, it's worse than it sounds. For example, if your client application is searching for a particular piece of data, such as the information about a specific person, some data about all of the people needs to be transferred across the network so that the client application can read through each person's data, searching for the one of interest. In other words, if data about a million people is stored in your file server database, and your client is searching for one particular person, it is likely that some data about all one million people will be transferred across the network, and that is only for one client application. (I say some data, since most databases make use of indexes. I'll discuss indexes in more detail in the next article in this series.) Consider what happens when twelve different client applications, on twelve different workstations on the network, are each searching for one person in the database. I think you get the picture. As far as stability goes, file server databases lack centralized control of the data and, as a result, are prone to data corruption. In a file server database, each and every client application can read and write the data stored in the shared files on the network.
All it takes is for one of these client applications to have a problem during a write operation (such as being unplugged from the network) and the database can become corrupt. This potential for corruption increases in direct proportion to the number of client applications writing to the database. Even if there is only one client writing to the database, an error during a write operation can render the underlying database unusable. (And this is why backing up your data is so very important. You cannot predict when a problem like this will be encountered.)

Remote Database Servers

A remote database server is an application that manages your database. When you write a client application that involves a remote database server, your client application does not read or write data directly from files. Instead, it makes all of its requests for data through the remote database server. This general architecture is referred to as client/server architecture. This distribution of responsibilities produces three primary benefits. First of all, it distributes the processing of data across several machines.
Figure 3: A client/server database

The second benefit is a dramatic reduction in network traffic. Consider the previous example where a client application needs to locate a single person in a file of a million people. In a client/server scenario, the client requests the single person from the database server, which searches the data files, returning only the one located person (if found) across the network to the client. That's almost a million-to-one reduction in network traffic. Finally, the client/server architecture is profoundly more stable than file server databases. This is because the remote database server can perform any requested data writes in a highly controlled fashion, namely by using transactions. For example, after the remote database server receives a properly formed request for a write operation from the client, the server begins a transaction. (And if the client's network connection fails during the write request, the request will not be well-formed, and therefore will be ignored by the server.) The transaction on the server normally involves the server making a note of what it wants to write, attempting to write the data, and then erasing the note (I'm really simplifying this here, but you get the picture). If something goes wrong during the server's write operation, the server uses its notes to either complete the write request (if possible) or to restore the data to its original state. As a result, it is very hard to corrupt data that is managed by a remote database server.

Delphi and Databases

If you want to use custom file structures in your database applications, this is something that you do manually.
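The "make a note, write, erase the note" sequence just described can be imitated in a toy Free Pascal style program. Everything here is invented for illustration (the file names, the SafeUpdate and RecoverIfNeeded helpers, the single-integer "database"); real servers use far more sophisticated journaling, but the recovery idea is the same.

```pascal
program JournalDemo;
{$mode objfpc}{$H+}
uses
  SysUtils;

procedure SaveValue(const FileName: String; Value: Integer);
var T: Text;
begin
  Assign(T, FileName); Rewrite(T); WriteLn(T, Value); Close(T);
end;

function LoadValue(const FileName: String): Integer;
var T: Text;
begin
  Assign(T, FileName); Reset(T); ReadLn(T, Result); Close(T);
end;

procedure SafeUpdate(NewValue: Integer);
begin
  { 1. note the old value in a journal file before touching the data }
  SaveValue('journal.txt', LoadValue('data.txt'));
  { 2. attempt the real write }
  SaveValue('data.txt', NewValue);
  { 3. success: erase the note }
  DeleteFile('journal.txt');
end;

procedure RecoverIfNeeded;
begin
  { a leftover journal means a write was interrupted: restore the old value }
  if FileExists('journal.txt') then
  begin
    SaveValue('data.txt', LoadValue('journal.txt'));
    DeleteFile('journal.txt');
  end;
end;

begin
  SaveValue('data.txt', 100);
  SafeUpdate(250);
  RecoverIfNeeded;                 { nothing to do: the journal was erased }
  WriteLn(LoadValue('data.txt'));  { 250 }
end.
```

If the program died between steps 1 and 3, the journal file would survive, and RecoverIfNeeded would roll the data back on the next start.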
Specifically, you write the procedures for creating and populating your data structures (which might be record types), create the procedures for reading and writing your data, and everything in between, including displaying your data structures in your user interface, detecting a user's changes to that data, and populating your data structures from the user interface for the purpose of writing back to the custom files. Sure, you can encapsulate all of this in custom components, but the bottom line is that you are responsible for every aspect of the process. If you are using one of the supported file server databases or remote database servers, it's a completely different story.
TSQLConnection, TIBConnection, and TAdsConnection. There are other TDataset classes as well, and they play their appropriate roles, but those are not relevant to the current discussion. While Delphi 1 included only one set of TDataset descendants, Delphi 2010 includes no fewer than six sets (I'll go into these in some detail in a future article in this series). In addition, a number of database vendors provide their own implementations of TDataset descendants. And through these, you can access data in an amazing range of databases, including Paradox, dBase, MS Access, MS SQL Server, Oracle, DB2, MySQL, SQL Anywhere, Sybase ASE, InterBase, Firebird, BlackfishSQL, Advantage Database Server, and many more.

Delphi's Data-Aware Controls

While the various TDataset classes give you access to a remarkable collection of databases, it's Delphi's data-aware controls that make it simple to build user interfaces that permit your users to interact with the underlying data. Some of these classes, such as the DBNavigator and DBGrid, point to an entire TDataset, while others, such as TDBEdit and TDBMemo, refer to a single field of a TDataset (I'll go into more detail about fields and datasets in the next article in this series, but for now suffice it to say that some data-aware controls refer to collections of data, while others refer to single data points). In many cases, these controls not only display data, but permit your users to change the data. For example, a DBGrid can be used to display most types of textual data from a database, as well as edit this data (so long as the underlying TDataset permits editing). An example of a DBGrid displaying data from a database is shown in Figure 4.
Figure 4: Data displayed in a DBGrid

Other controls, such as the DBImage and DBLabel, display data but do not permit you to modify it. Finally, the DBNavigator control doesn't display data at all, but instead provides you with a convenient means of moving forward and backward in your database. (The DBNavigator supports additional operations, such as placing a TDataset in edit mode, posting changes, and canceling changes, to name a few, but only if the underlying TDataset allows these operations.) Before continuing to the final component that makes up Delphi's basic support for the development of database applications, the TDataSource, a comment about data-aware controls is appropriate. Some Delphi developers do not approve of the use of Delphi's data-aware controls, preferring instead to control the many aspects of the user interface, as far as database data goes, manually. I don't want to go into the detailed arguments on both sides of this position here. Instead, I want simply to say that Delphi's data-aware controls are easy to use, and provide many of the capabilities that you want in most of your basic user interfaces. What is important, though, is that if you want something more than the data-aware controls offer, you are free to use a wide variety of alternative techniques.
COMPONENTS
DEVELOPERS
Page 33
If the TDataset complies, the DBGrid will accept your keystrokes. On the other hand, if the underlying TDataset is read-only, it will reject the request to enter edit mode, and the DBGrid will not be permitted to accept the data entry. Figure 5 depicts the relationship between the BDE TDatasets, the TDataSource, and data-aware controls.

Figure 5: The relationship between BDE TDatasets, TDataSource, and data-aware controls

The second role played by TDataSources is to act as an intermediary between two or more TDatasets. In many respects, this is similar to the role it plays when interacting with data-aware controls, but different in that no user interface elements are involved. I'm not going to say any more about this second role at this time, but I wanted to at least acknowledge that it exists.

Summary

In this first of an extended series of articles on Delphi database development, I have started at the beginning. This article began with a general look at databases, with a brief discussion of the most common types of databases supported by Delphi. It continued with a general overview of Delphi's data-related components. In the next article in this series, I will discuss the various types of data structures that databases contain, including tables, indexes, views, and stored procedures. I will then show you how to use Delphi's data-related components to work with these types of data.
About the Author

Cary Jensen is Chief Technology Officer of Jensen Data Systems, a consulting, training, development, and documentation and help system company. Since 1988 he has built and deployed database applications in a wide range of industries. In addition, Cary is the best-selling author of more than 20 books on software development, and winner of the 2002 and 2003 Delphi Informant Reader's Choice Award for Best Training. A frequent speaker at conferences, workshops, and seminars throughout much of the world, he is widely regarded for his self-effacing humor and practical approaches to complex issues. Cary has a Ph.D. from Rice University in Human Factors Psychology, specializing in human-computer interaction.
The Lazarus Complete Guide will be available in mid December 2010. You can order it directly from our web shop. If you pre-order the LAZARUS COMPLETE GUIDE you can have a Lazarus USB stick for only 15.00.
Graphical User Interface programs for Windows 32 and 64, Windows CE, Mac OS, Unix and Linux
Scalable

Advantage comes in two basic flavors: the Advantage Database Server (ADS) and the Advantage Local Server (ALS). ALS is a free, file-server based technology whose API (application programming interface) is identical to that of ADS. ALS permits developers to deploy their Advantage applications royalty-free to clients who do not need the stability and power of a separate database server. Importantly, as the needs of applications deployed with ALS grow over time, those applications can be almost effortlessly scaled to client/server technology, in many cases simply by deploying ADS. So long as the client applications are designed correctly, they will begin using ADS the next time they execute.

New Features and Enhancements in Advantage 10

Rather than reciting a laundry list of updates, I have organized the enhancements into the following sections: major performance improvements, enhanced notifications, additions to Advantage SQL, nested transactions, Unicode support, additional 64-bit clients, added design-time support for Delphi, and side-by-side installation. For a detailed listing of all of the updates found in Advantage 10, see the white paper at the following URL:
http://www.sybase.com/files/White_Papers/Advantage_WhatsNewADS10_WP.pdf
Major Performance Improvements

The Advantage Database Server has always been recognized for its superior performance, being able to handle very large amounts of data with blinding speed. That makes it all the more remarkable that one of the most enticing aspects of upgrading to Advantage 10 involves performance. Specifically, the performance of database operations in client applications will improve simply by upgrading the server to Advantage 10. In some cases, these performance gains will be significant. Many of the internal systems that contribute to Advantage's already impressive performance were evaluated by Advantage's R&D engineers. Where possible, improved algorithms were introduced, caching was implemented or enhanced, and resources were pooled. These changes resulted in more efficient indexes, improved transaction handling, and more intelligent management of resources such as threads, record locks, and file writes. The effects of these improvements range from nice to stunning. During Advantage 10's Beta cycle, one of the Beta testers reported the results of his performance tests on some of his larger queries involving, in some cases, millions of records. He found that some Advantage 10 queries executed 40 percent faster than the same queries in Advantage 9. In other cases, the Advantage 10 queries were exponentially faster (one query that ran in 2.7 seconds in Advantage 9 took about 1 millisecond in Advantage 10). The R&D team has found similar improvements during testing. But SQL queries are not the only area of Advantage to benefit from these internal improvements. Operations that rely on Advantage's support for navigational operations have also improved. In fact, the Help files for Advantage 10 list no fewer than 20 specific improvements or optimizations introduced in Advantage 10.
And these updates affect everything from cascading referential integrity updates to record insertion, from memo file header updates to table creation, from low-level index operations to worker thread management. Simply put, the performance enhancements introduced in Advantage 10 alone make a solid business case for upgrading from an earlier version of Advantage.

Enhanced Notifications

Notifications, a feature originally introduced in Advantage 9, provide you with a mechanism by which Advantage can notify interested client applications that some change has occurred on the server. For example, a client application can subscribe to a notification in order to be informed when the contents of a specific table have changed. The client application can then use this information to update the end user's view of that data. A small change to notifications in Advantage 10 has resulted in a very significant improvement in their utility:
This is no longer the case. As a result, if you write a stored procedure whose operations should be performed in a transaction, you can safely call BEGIN TRANSACTION, even if that stored procedure is called by code where a transaction is already active.

New Table Features

Several interesting new table-specific features have been introduced in Advantage 10. Several of these are related to transactions and table caching. Let's consider table caching first. To begin with, so long as memory resources permit, temporary tables are now kept entirely in cache. As a result, operations that rely on temporary tables are usually very fast. There is also a new table property called Table Caching. Most tables are created with Table Caching set to None. These tables are not cached, and any changes to them are written to the underlying file immediately. When Table Caching is set to either Read or Write, the corresponding table is kept in cache while it is open, making its data highly available. These settings are normally used for data that is largely static, and which can be reconstructed if the table becomes corrupt. Specifically, tables held in cache are not written to disk except when the table is closed. As a result, changes to their data will be lost if Advantage unexpectedly shuts down without being able to persist those tables' contents (for instance, if there is a sudden failure of your server's power supply). However, this functionality can be very useful for static data (zip codes, part numbers, and so forth). The transaction-free tables feature is also a table property, called Trans Free Table. When set to True, the associated table does not participate in transactions. There are two implications of a table not participating in an active transaction. First, changes made to a Trans Free Table during a transaction are not rolled back even if the transaction itself is rolled back.
Second, changes to data in a Trans Free Table are not isolated during a transaction, being immediately visible to all other client applications, even though the transaction has not yet been committed. Just as when a table's Table Caching property is set to Read or Write, Trans Free Table is set to True only for special tables in most applications. For example, you may use a table to log a user's actions in an application. In such cases, you may want to log that a user tried to perform some task, even though the action may fail and the user's changes may be rolled back. Similarly, you may have a table used for generating unique key field values. This table may have a single record and a single field that holds an integer value. A client needing a key would lock this table, read the key, increment the integer, and then release the lock. With such a table, the incremented key needs to be visible to all client applications, even if individual clients increment the key from within a transaction. If such a table were not a Trans Free Table, other clients would not be able to access the incremented key until the transaction was committed, rendering the table useless for its intended purpose.

Unicode Support

Although Unicode support is arguably a table feature, its significance warrants separate consideration. In short, Advantage 10 introduces three new field types. These types, nchar, nvarchar, and nmemo, are UTF-16 Unicode field types. The nchar type is a fixed-length Unicode string field and nvarchar is a variable-length Unicode string field. The data for these two field types is stored entirely in the table file. The nmemo field, by comparison, is a variable-length Unicode field that is stored in the memo file. Together, these three fields provide you with a number of options for storing Unicode symbols and characters in Advantage tables.
In previous versions of Advantage, you would have to form your query like the following:

SELECT * FROM CUSTOMER WHERE Active = True;
Also, TOP queries now support a START AT clause, which permits you to select a specific number of records beginning from some position in the result set other than the top. For example, the following query will return records 11 through 15 from the CUSTOMER table, ordered by last name.
SELECT TOP 5 START AT 11 * FROM CUSTOMER ORDER BY LastName;

A collection of bitwise SQL operators has also been introduced. These include AND, OR, and XOR, as well as >> (right-shift) and << (left-shift).
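For readers more at home in Object Pascal than SQL, the semantics of these bitwise operators match Delphi's own and, or, xor, shl and shr on integers, as this quick console check (Free Pascal syntax) illustrates:

```pascal
program BitwiseDemo;
{$mode objfpc}
begin
  WriteLn(12 and 10);  { 1100 and 1010 = 1000 -> 8  }
  WriteLn(12 or 10);   { 1100 or  1010 = 1110 -> 14 }
  WriteLn(12 xor 10);  { 1100 xor 1010 = 0110 -> 6  }
  WriteLn(1 shl 4);    { left-shift:  1 * 2^4 = 16  }
  WriteLn(80 shr 3);   { right-shift: 80 div 2^3 = 10 }
end.
```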
There is also a new SQL scalar function, ISOWEEK, which returns the ISO 8601 week number for a given date (it is also a new expression engine function). And some of the SQL scalar functions that were previously not expression engine functions now are. These include DAY, DAYOFYEAR, DAYNAME, and MONTHNAME, to name a few. These are in addition to CHAR2HEX and HEX2CHAR, which are newly added expression engine functions. Support in the expression engine means indexes can now be created using these functions, which in turn allows the Advantage query engine to fully optimize restrictions that use these scalars. Finally, there are a number of new system stored procedures and system variables. The following are just a few of the new system stored procedures available in Advantage 10: sp_SetRequestPriority, sp_GetForeignKeyColumns, and sp_IgnoreTableTransactions.
Nested Transactions

Speaking of nested transactions, Advantage 10 now supports them. In previous versions of Advantage, code executing in an active transaction could not attempt to start a transaction without raising an exception.
A single PDF bundling together 3 ebooks: Delphi 2007 Handbook, Delphi 2009 Handbook, and Delphi 2010 Handbook. The material has not been edited, it is simply the merging of the three original ebooks. The collection of books covers all of the new features of Delphi 2010 since Delphi 7, covering the IDE, the Delphi language, Unicode support, Windows development, new VCL components, database access, DataSnap, and much more. In total the combined book has almost 1,000 pages of Delphi-related content, in a single easy-to-search PDF file.
http://sites.fastspring.com/wintechitalia/product/delphihandbookscollection
If you take out a new subscription we will offer you a discount of 5,00 per subscription. We also have some special offers for our subscribers: you will find them on page 26 (Nexus), page 72 (components4developers) and page 27 (TMS software).
Real-time data collection
starter
In engineering science you often need to make physical data simultaneously available to more than one application. To simplify the programming of the following example illustrating how to do this, an InterBase/Firebird database is used, as it provides marshalling, atomicity, isolation and durability. This article describes a modular approach, from sensing temperature data to storing the data in a database. It will also serve to document how to write a program for use with the DelphiDevBoard (see page 42).
Figure 1: The application

The signal route from sensor to data storage takes this pathway:
1) Sensor
2) Analogue to digital converter
3) Transmitter
4) Data collection application
5) RDBMS (relational database management system)

Figure 2: The organisation of the system
1. Sensor.
In this example a temperature is to be measured with a resolution of at least 0.1 degree Celsius and recorded together with a date and time stamp. The sensor chosen is the DS18B20 from Maxim, as it has a measuring range of -55 to +125 degrees Celsius with a resolution of 1/16 degree. This satisfies the required specification.

2. Analogue to digital converter.

The DS18B20 has a built-in 12-bit A/D converter. The digital data ranges from %1111 1100 1001 0000 = $FC90 = -880, equivalent to -55.0 degrees Celsius, up through %0000 0000 0000 0000 = $0000 = 0 for zero degrees, and beyond to %0000 0111 1101 0000 = $07D0 = 2000, equivalent to +125 degrees Celsius. In other words, the digital value from the converter is 16 times the Celsius temperature.

3. Transmitter.

The DelphiDevBoard (see page 42) is used as the transmitter. The standard firmware can be used to implement the transmitter. It contains these components:
- A driver for the temperature sensor. It receives the digital data from the sensor and provides the data as an integer.
- An I/O Server. This background application fires 8 times per second, calling the driver and writing the received integer into a Pascal record in memory (Var Inputs : TInputs); the temperature is stored in Inputs.Temp.
- An M485aServer, running in the background, which stands by to provide memory data on incoming requests through the RS232 port. The open protocol used is M485a.
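The raw-value arithmetic in step 2 can be checked with a few lines of Free Pascal style code. The RawToCelsius helper is an invented name; the trick is that casting the 16-bit word to SmallInt interprets it as a signed two's-complement value, so $FC90 becomes -880 before the division by 16.

```pascal
program Ds18b20Demo;
{$mode objfpc}{$H+}
uses
  SysUtils;

{ Convert a raw 16-bit DS18B20 reading to degrees Celsius }
function RawToCelsius(Raw: Word): Double;
begin
  Result := SmallInt(Raw) / 16;   { signed interpretation, 1/16 degree units }
end;

begin
  WriteLn(RawToCelsius($FC90):0:1);  { -55.0 }
  WriteLn(RawToCelsius($0000):0:1);  { 0.0   }
  WriteLn(RawToCelsius($07D0):0:1);  { 125.0 }
end.
```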
4. Data collection application.
To collect the data a PC is used running Windows and a Delphi application. As most modern computers don't have RS232 ports, a USB dongle is used to connect to the transmitter. To ease the interfacing with the transmitter, the DelphiDevBoard comes with the lib485a.dll implementing the M485a protocol. The task of this application is to get the temperature, add a date and time stamp and store it in a database, for which we use InterBase/Firebird. The modular Delphi project dta_collector.dpr contains four layers:
- Presentation layer
- Business layer
- Communication layer
- Persistence layer
by Anton Vogelaar
Presentation layer.

The source code of the GUI can be read in UGUI.pas, and the screenshot shows how the controls are positioned. Control of this application is handled by a controller, an instance of TContr defined in the business layer. The controller is instantiated in the OnShow handler of the main form (named GUI), and released in its OnHide handler. In the top-left corner is a SpeedButton with a red/green glyph indicating the on/off line status. Its OnClick event calls the GoOnLine and GoOffLine private methods of the main form. These methods enable and disable the visibility of a panel showing the measured data and call the Start and Stop methods of the controller. The form also provides the public method Refresh to set the visual controls with numbers obtained from the lower layers.
Real-time data collection (continuation 1)
Unit UGUI;
(* =========== Interface =================== *)
Interface

Uses
  Windows, Messages, SysUtils, Variants, Classes, Graphics,
  Controls, Forms, Dialogs, StdCtrls, ExtCtrls, Buttons,
  ToolWin, ComCtrls, UDomain;

Type
  TGUI = Class (TForm)
    TBar    : TToolBar;
    BtnGo   : TSpeedButton;
    PnlMain : TPanel;
    LbTemp  : TLabel;
    LbPotm  : TLabel;
    ShTot   : TShape;
    ShTemp  : TShape;
    ShPotm  : TShape;
    Label1  : TLabel;
    Label2  : TLabel;
    Label3  : TLabel;
    LbLog   : TLabel;
    Shape1  : TShape;
    Procedure BtnGoClick (Sender : TObject);
    Procedure FormShow   (Sender : TObject);
    Procedure FormHide   (Sender : TObject);
  Private
    Contr : TContr;
    Procedure GoOnLine;
    Procedure GoOffLine;
  Public
    Procedure Refresh (STemp, SPotm, SLog : String);
  End;

Var
  GUI : TGUI;

(* =========== Implementation ============== *)
Implementation
{$R *.dfm}

(* =========== Private ===================== *)
Procedure TGUI.GoOnLine;
Begin
  Try
    Contr.Start;
    PnlMain.Visible := True;
  Except
    GoOffLine;
    Raise
  End;
End;

Procedure TGUI.GoOffLine;
Begin
  BtnGo.Down      := False;
  PnlMain.Visible := False;
  Contr.Stop;
End;

(* =========== Public ====================== *)
Procedure TGUI.Refresh (STemp, SPotm, SLog : String);
Begin
  LbTemp.Caption := STemp;
  LbPotm.Caption := SPotm;
  LbLog.Caption  := SLog;
End;

(* =========== Form ======================== *)
Procedure TGUI.FormShow (Sender : TObject);
Begin
  Contr := TContr.Create;
End;

Procedure TGUI.FormHide (Sender : TObject);
Begin
  GoOffLine;
  Contr.Free;
End;

Procedure TGUI.BtnGoClick (Sender : TObject);
Begin
  If BtnGo.Down Then GoOnLine Else GoOffLine;
End;

(* =========== End ========================= *)
End.

Unit UDomain;

Interface

Uses
  ExtCtrls, SysUtils, UDB;

Type
  TStr80 = String [80];

Type
  TContr = Class
  Private
    Timer : TTimer;
    DB    : TDB;
    NLog  : Integer;
    Procedure DoTimer (Sender : TObject);
  Public
    Constructor Create;
    Destructor  Destroy; Override;
    Procedure   Start;
    Procedure   Stop;
  End;

(* =========== Implementation ============== *)
Implementation

Uses UGUI;

(* =========== DLL procedures ============== *)
Procedure M485a_Open    (Port : Integer);           External 'lib485a.dll';
Procedure M485a_Close;                              External 'lib485a.dll';
Procedure M485a_ProdID  (Var ProdID : TStr80);      External 'lib485a.dll';
Procedure M485a_ProdIDF (Var ProdID : TStr80);      External 'lib485a.dll';
Procedure M485a_Vars    (Var MainVars : TStr80);    External 'lib485a.dll';
Procedure M485a_RdRam   (N, MAddr : Word; Var Buf); External 'lib485a.dll';
Procedure M485a_WrRam   (N, MAddr : Word; Var Buf); External 'lib485a.dll';
Procedure M485a_RdEe    (N, MAddr : Word; Var Buf); External 'lib485a.dll';
Procedure M485a_WrEe    (N, MAddr : Word; Var Buf); External 'lib485a.dll';

(* =========== Public ====================== *)
Constructor TContr.Create;
Begin
  Inherited;
  DB := TDB.Create;
  DB.Open ('192.168.0.16:/db/test.fdb');
End;

Destructor TContr.Destroy;
Begin
  DB.Close;
  FreeAndNil (DB);
  Inherited;
End;

Procedure TContr.Start;
Begin
  M485a_Open (1);
  Timer          := TTimer.Create (Nil);
  Timer.Interval := 1000;
  Timer.OnTimer  := DoTimer;
  Timer.Enabled  := True;
End;

Procedure TContr.Stop;
Begin
  If Timer = Nil Then Exit;
  Timer.Enabled := False;
  FreeAndNil (Timer);
  M485a_Close;
End;

(* =========== Timer ======================= *)
Procedure TContr.DoTimer (Sender : TObject);
Var
  S            : TStr80;
  STemp, SPotm : String;
  ITemp, IPotm : Integer;
Begin
  If Not Timer.Enabled Then Exit;
  M485a_Vars (S);
  STemp := Copy (S, 3, 4);
  SPotm := Copy (S, 7, 4);
  ITemp := StrToInt ('$' + STemp);
  IPotm := StrToInt ('$' + SPotm);
  STemp := Format ('%.1f C', [ITemp / 16]);
  SPotm := Format ('%.1f %%', [IPotm / 10.23]);
  Inc (NLog);
  GUI.Refresh (STemp, SPotm, IntToStr (NLog));
  DB.Save (ITemp, IPotm);
End;

(* =========== End ========================= *)
End.
Real-time data collection (continuation 2)
Business and Communication layer.

Both business and communication functionality is implemented in the unit UDomain. As Windows is an event-based operating system, the controller class contains a timer instance firing the OnTimer event every second. This event is linked to the DoTimer method. In TContr.DoTimer the procedure M485a_Vars (S) is called. This procedure resides in the lib485a DLL and returns a string representation of the Inputs record in hexadecimal format. The temperature is four hexadecimal characters long, starting at position 3. ITemp := StrToInt ('$' + Copy (S, 3, 4)); returns the temperature as an integer in multiples of 1/16 degree Celsius, where the '$' character forces StrToInt to treat the string as hexadecimal. This value is passed to the GUI as a string created by Format ('%.1f C', [ITemp / 16]) when the method GUI.Refresh is called. This value is also to be stored in the database. Since all database actions are encapsulated in the class TDB, it is sufficient to call DB.Save. Instantiating and releasing the DB object is handled by the controller's Create and Destroy methods.

Persistence layer.

This layer contains all the functionality required to save the measured data in an InterBase/Firebird database. Instances of TIbDatabase, TIbTransaction and TIbSql are used, since TIbSql is a lightweight communication class. Objects of these classes are instantiated in the TDB.Open method and released in TDB.Close. The actual storage functionality is implemented in the TDB.Save method and embedded in a transaction.
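The hexadecimal parsing described above can be tried in isolation with a small Free Pascal style program. The reply string below is invented for illustration (a real one comes from M485a_Vars), as is the TempFromReply helper name; the parsing itself is exactly the Copy/StrToInt technique used in TContr.DoTimer.

```pascal
program ParseDemo;
{$mode objfpc}{$H+}
uses
  SysUtils;

{ Extract the temperature (in 1/16 degree units) from an M485a-style
  reply: four hexadecimal characters starting at position 3 }
function TempFromReply(const S: String): Integer;
begin
  { the '$' prefix makes StrToInt parse the substring as hexadecimal }
  Result := StrToInt('$' + Copy(S, 3, 4));
end;

var
  ITemp: Integer;
begin
  ITemp := TempFromReply('AB01550200');     { $0155 = 341 }
  WriteLn(ITemp);                           { 341 }
  WriteLn(Format('%.1f C', [ITemp / 16]));  { 341/16 = 21.3 C }
end.
```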
Unit UDB;

Interface

Uses ExtCtrls, SysUtils, IbDatabase, IbSQL, Classes;

Type
  TDB = Class
  Private
    IbDb  : TIbDatabase;
    IbTr  : TIbTransaction;
    IbSQL : TIbSQL;
  Public
    Procedure Open (DbName : String);
    Procedure Close;
    Procedure Save (ITemp, IPotm : Integer);
  End;

(* =========== Implementation ============== *)
Implementation

(* =========== Public ====================== *)
Procedure TDB.Open (DbName : String);
Begin
  IbDb  := TIBDatabase.Create (Nil);
  IbTr  := TIBTransaction.Create (Nil);
  IbSQL := TIBSQL.Create (Nil);
  IbTr.DefaultDatabase := IbDb;
  With IbDb Do
  Begin
    Params.Add ('user_name=SYSDBA');
    Params.Add ('password=masterkey');
    DatabaseName       := DbName;
    LoginPrompt        := False;
    SQLDialect         := 3;
    DefaultTransaction := IbTr;
    Open;
  End;
  With IbSQL Do
  Begin
    Database    := IBDb;
    Transaction := IBTr;
  End;
End;

Procedure TDB.Close;
Begin
  IbDb.Close;
  FreeAndNil (IbSQL);
  FreeAndNil (IbTr);
  FreeAndNil (IbDb);
End;

Procedure TDB.Save (ITemp, IPotm : Integer);
Begin
  IbTr.StartTransaction;
  IbSQL.SQL.Text := Format ('insert into LOG (TEMP, POTM) values (%s, %s)',
                            [IntToStr (ITemp), IntToStr (IPotm)]);
  IbSQL.ExecQuery;
  IbTr.Commit;
End;

(* ============ End ======================== *)
End.
5. Relational Database Management System (RDBMS). Installing Interbase / Firebird provides the database engine, not the database to be used. The database can be created using the console application ibsql. To automate this process, and to be able to recreate the database when errors are found or when installing on a different machine, all statements are included in the script file create_db.sql. This script is executed with the command: ibsql -q -i create_db.sql. The script contains comments, the creation of the database, the creation of tables, the creation of triggers and generators to implement auto-increment fields, and sample data to populate the tables.
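The article does not print the script itself, but a sketch of what create_db.sql might contain is shown below. The LOG table with its TEMP and POTM columns comes from the TDB.Save listing; the ID column, the generator name and the sample values are our assumptions.

```sql
/* create_db.sql -- illustrative sketch, not the original script */
CREATE DATABASE '192.168.0.16:/db/test.fdb' USER 'SYSDBA' PASSWORD 'masterkey';

CREATE TABLE LOG (
  ID   INTEGER NOT NULL PRIMARY KEY,
  TEMP INTEGER,
  POTM INTEGER
);

/* A generator plus a before-insert trigger emulate an auto-increment field */
CREATE GENERATOR GEN_LOG_ID;

SET TERM !! ;
CREATE TRIGGER LOG_BI FOR LOG
ACTIVE BEFORE INSERT AS
BEGIN
  IF (NEW.ID IS NULL) THEN
    NEW.ID = GEN_ID(GEN_LOG_ID, 1);
END !!
SET TERM ; !!

/* Sample data to populate the table */
INSERT INTO LOG (TEMP, POTM) VALUES (320, 512);
```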
Marshalling
(In computer science, marshalling (similar to serialization) is the process of transforming the memory representation of an object into a data format suitable for storage or transmission.) It is typically used when data must be moved between different parts of a computer program, or from one program to another. Marshalling is used to communicate with remote objects by means of serialized objects. It simplifies complex communication by using custom or complex objects to communicate instead of primitives. The opposite, or reverse, of marshalling is called unmarshalling (or demarshalling, similar to deserialization). - wiki
Atomicity
In database systems, atomicity (or atomicness) is one of the ACID transaction properties. In an atomic transaction, a series of database operations either all occur, or none occurs. A guarantee of atomicity prevents updates to the database from occurring only partially, which can cause greater problems than rejecting the whole series outright. The etymology of the phrase originates in the Classical Greek concept of a fundamental and indivisible component; see atom. An example of atomicity is ordering an airline ticket, where two actions are required: payment, and a seat reservation. The potential passenger must either: 1. both pay for and reserve a seat; OR 2. neither pay for nor reserve a seat. The booking system does not consider it acceptable for a customer to pay for a ticket without securing the seat, nor to reserve the seat without payment succeeding. - wiki
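The TDB.Save method of unit UDB already wraps its insert in a transaction. A sketch of an exception-safe variant that also rolls back on failure — the Try/Except block and the Rollback call are our addition, not part of the printed listing:

```pascal
Procedure TDB.Save (ITemp, IPotm : Integer);
Begin
  IbTr.StartTransaction;
  Try
    IbSQL.SQL.Text := Format ('insert into LOG (TEMP, POTM) values (%s, %s)',
                              [IntToStr (ITemp), IntToStr (IPotm)]);
    IbSQL.ExecQuery;
    IbTr.Commit;       { all operations take effect together... }
  Except
    IbTr.Rollback;     { ...or none of them does }
    Raise;
  End;
End;
```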
Page 41
(Advertisement)
The DelphiController and DelphiDevBoard were designed to help students, Pascal programmers and electronic engineers understand how to program microcontrollers and embedded systems. This is achieved by providing hardware (either pre-assembled or as a DIY kit of components), course material, templates, a Pascal-compatible cross-compiler, and the use of a standard IDE for development and debugging (Delphi, Lazarus or FreePascal).
Information about sales etc. www.blaisepascal.eu
M324D40
Delphi CPU
Hardware. M324D40 DelphiCPU: a programmed AtMega324 40-pin dual-in-line controller chip measuring 50 x 17 x 4 mm. This chip contains all basic computer parts, such as:
- 32 kBytes Flash memory for program storage
- 2 kBytes RAM for variables
- 32 general purpose registers
- 1 kByte of EeRom for non-volatile storage
- an AVR CPU
- a hardware multiplier for fast mathematics
- a clock oscillator able to run up to 20 MHz.
The chip also contains a set of input and output peripherals:
- JTAG interface for real-time debugging
- 3 timer counters with interrupts for a system clock and delays
- 6 PWM channels which can be used for motor speed and direction control
- 8 analog-to-digital converters (10 bit) with 1x, 10x and 200x amplifiers for interfacing with analog sensors, voltages and potentiometers
- I2C interface for expanding the number of peripherals
- Watchdog timer for automatic reset when a failure is detected
- Analog comparator for accurate level detection
- 32 programmable digital in- and output lines for lamps, switches, LCD, TCP/IP etc.
4 kBytes of the 32 kByte flash are reserved and preprogrammed with:
- OS: an operating system implementing a system clock and the starting of servers and user applications
- BIOS: I/O drivers for EeRom and standard hardware
- Hardware test: standard hardware test application
- M485A server: monitor for RAM and EeRom, including a Flash programmer for user applications
- I/O server: synchronized I/O with RAM records
VE08201 DelphiStamp: a miniature (52 x 20 x 20 mm) plug-in version of the DelphiController described above, with additional resources:
- Flash memory = 128 kBytes
- RAM memory = 4 kBytes
- EeRom memory = 4 kBytes
- xtal oscillator = 11.06 MHz
DelphiDevBoard. This 90 x 80 x 18 mm printed circuit board contains the VE09206 DelphiController plus additional sensors and actuators, enabling you to make a quick start in writing applications. For the provided I/O the BIOS contains the required drivers. Templates are provided for simulation in the Delphi IDE, making the R&D cycle much quicker.
On-board sensors are:
- a potentiometer providing an angle measurement
- two push buttons
- a temperature sensor with a resolution of 0.06 degree Celsius
On-board actuators are:
- 10-segment LED bar
- 2-digit 7-segment LED display
- 3 enable/disable switches for servers and user application
Communication:
- a RS232 port for monitoring and uploading user applications
- 10-pin header with 6 universal I/O wires to control user hardware (in future: TCP/IP networking will be possible through this header)

The VE09206 DelphiController is a high quality printed circuit board with dimensions of 67 x 51 x 13 mm which contains, besides the above described DelphiCPU:
- a 5 to 9 Volt DC regulated power supply with screw terminals and polarity protection, providing the 3.3 Volt board voltage
- a 7.37 MHz crystal oscillator
- a LED as activity indicator
- a reset switch
- JTAG connector for real-time debugging
- RS232 or RS485 communication port
- a 40-pin socket to connect the controller with the application hardware
Page 42
Interface application. In this mode the PC is the master and executes the control algorithm, while the DelphiController acts as an interface slave and follows the commands coming from the PC through its RS232 port. When the PC stops, the DelphiController will also stop. In interface mode the DelphiController holds an input and an output record in RAM. Each field in these records reflects the state of the sensors and actuators. The IOServer running in the background synchronizes these records with the physical hardware through the BIOS drivers. The running M485A monitor server will accept read and write commands on the input and output records through the RS232 port. To make communication between the PC and the DelphiController easy, a communication DLL, templates and sample applications are provided.
Documentation. The tutorial package contains:
- software user's manual including install instructions
- hardware user's manual
- description of the hardware test method
- several detailed project descriptions
- compiler manual
- data sheets of the components used
- electronic circuit diagrams
- source listing of the drivers used

Cross-compiler. The cross-compiler accepts Pascal code as used in Delphi and Lazarus (no classes or objects) and converts this to AVR object code which the DelphiController will run after uploading. The cross-compiler has a built-in assembler which can be used for drivers and time-critical sections of the user application.

IDE. The use of an IDE (Integrated Development Environment) greatly enhances the development process by providing features such as code completion, help, file management etc. Suitable IDEs are provided by Delphi, Lazarus and FreePascal. Using the IDE and suitable templates, the controller algorithm can be run in simulation mode, in which the provided GUI units and templates simulate sensors and actuators. After debugging, the unmodified source code can be used for the cross-compiler.

Templates. To help you learn quickly how to program this hardware, templates are provided in units for typical unvarying code sections found in simple projects. By supplying templates and library units the controller's behaviour can be simulated in your chosen IDE. Templates are provided for three types of application programs, i.e. 1) Interface application, 2) GUI application and 3) Stand-alone application.

Stand-alone application. In this mode the full speed of the DelphiController can be utilized, as no background servers are required. Optionally the M485A monitor server can be started for remote control through the RS232 port.
GUI application. In this mode the DelphiController is the master and executes the control algorithm, while the PC retrieves information from the RS232 port to update a GUI or to send commands to the controller. When the PC stops, the DelphiController will continue. In GUI mode the DelphiController holds an input and an output record in RAM. Each field in these records reflects the state of the sensors and actuators. The IOServer running in the background synchronizes these records with the physical hardware through the BIOS drivers. The running M485A monitor server will accept read and write commands on the input and output records through the RS232 port. To make communication between the PC and the DelphiController easy, a communication DLL, templates and sample applications are provided.
Information about sales etc.: www.blaisepascal.eu

About the author
Anton J. Vogelaar is an electronic measurement and control engineer. In 1974 he completed his electronic engineering study at the HTS in Utrecht. In 1982 he completed a course in engineering science at the University of Durham, UK. Since 1972 he has been director of Vogelaar Electronics Netherlands, which specialises in digital control equipment (air gauging, climate control, industrial control, bio-reactor control etc.). He also writes software which provides the control, GUI and persistence aspects required by the hardware his firm produces. The programming languages he uses include Assembler (AVR, PIC and 8751), Pascal, Delphi and Java on Windows and Linux (embedded) platforms. He has more than ten years' experience as a part-time teacher in advanced technical colleges and gives lectures in Pascal/Delphi. His other interests are: rowing, steam engines, study, technology and grandchildren.
Page 43
Controller
GUI (optional)
DelphiDevBoard
Written in Pascal
User hardware
DATABASE SPECIAL 2010 BLAISE PASCAL MAGAZINE
25 megabytes of Adobe PDF instruction files. Lazarus Help functionality is all pre-installed - no internet connection is required. The USB stick includes a wealth of documentation in Adobe PDF format. Only for subscribers: 25,00 plus 7,50 postage costs; non-subscribers: 35,00 plus 7,50 postage costs. Ordering at the Blaise
Object-Oriented Databases
starter expert Delphi 2010
Detlef Overbeek
To begin with, I think it is necessary to give you an overview of what a database actually is. So first of all we will explain the most relevant models, to be able to dive deeper, and finally end up with a real surprise: object-oriented databases aren't dead, as I thought they were (because of all the negative reactions I received while writing this article). On the contrary: they are thriving, and I found some very large companies using them. What makes it extra exciting: I found one object database that works with Delphi and/or Pascal, and you can find some of the Pascal code on the .iso image of the DVD available at the Blaise website. (If you want to, you could just order the DVD as well.) So let's try to understand the various database models, starting with an overview.

The Hierarchical Model
A hierarchical data model is a data model in which the data is organized into a tree-like structure. The structure allows repeating information using parent/child relationships: each parent can have many children but each child has only one parent. All attributes of a specific record are listed under an entity type. In a database, an entity type is the equivalent of a table; each individual record is represented as a row and an attribute as a column. Entity types are related to each other using 1:N mapping, also known as one-to-many relationships. The most widely known and used hierarchical databases are IMS, developed by IBM, and the Windows Registry by Microsoft. - wikipedia
(The explanations that follow are taken from the website at http://www.unixspace.com/context/databases.html.)
An owner record type can also be a member or owner in another set. The data model is a simple network, and link and intersection record types (called junction records by IDMS) may exist, as well as sets between them. Thus, the complete network of relationships is represented by several pairwise sets; in each set one record type is the owner (at the tail of the network arrow) and one or more record types are members (at the head of the relationship arrow). Usually a set defines a 1:M relationship, although 1:1 is permitted. The CODASYL network model is based on mathematical set theory.
http://en.wikipedia.org/wiki/CODASYL

(Network-model diagram with nodes: Preventive Maintenance; Rigid Pavement, Flexible Pavement; Spall Repair, Joint Seal, Crack Seal, Patching; Silicone Sealant, Asphalt Sealant.)
Figure 2: a network model

Relational Model (RDBMS - relational database management system)
A database based on the relational model developed by E.F. Codd. A relational database allows the definition of data structures, storage and retrieval operations and integrity constraints. In such a database the data and the relations between them are organised in tables. A table is a collection of records, and each record in a table contains the same fields.

Properties of Relational Tables:
* Values Are Atomic
* Each Row is Unique
* Column Values Are of the Same Kind
* The Sequence of Columns is Insignificant
* The Sequence of Rows is Insignificant
* Each Column Has a Unique Name

Certain fields may be designated as keys, which means that searches for specific values of those fields will use indexing to speed them up. Where fields in two different tables take values from the same set, a join operation can be performed to select related records in the two tables by matching values in those fields. Often, but not always, the fields will have the same name in both tables. For example, an "orders" table might contain (customer-ID, product-code) pairs and a "products" table might contain (product-code, price) pairs. To calculate a given customer's bill you would sum the prices of all products ordered by that customer by joining on the product-code fields of the two tables. This can be extended to joining multiple tables on multiple fields. Because these relationships are only specified at retrieval time, relational databases are classed as dynamic database management systems. The RELATIONAL database model is based on Relational Algebra.
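The bill calculation described above might look like this in SQL; table and column names are illustrative, not taken from any real schema:

```sql
-- Sum the prices of all products ordered by one customer,
-- joining the two tables on their product-code fields.
SELECT SUM(p.price)
FROM   orders o
JOIN   products p ON p.product_code = o.product_code
WHERE  o.customer_id = 42;
```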
The author is Alexander Lashenko, Toronto, Canada. The hierarchical data model organizes data in a tree structure. There is a hierarchy of parent and child data segments. This structure implies that a record can have repeating information, generally in the child data segments. Data is stored in a series of records, which have a set of field values attached to them. The hierarchy collects all the instances of a specific record together as a record type. These record types are the equivalent of tables in the relational model, with the individual records being the equivalent of rows. To create links between these record types, the hierarchical model uses parent/child relationships, with a 1:N mapping between record types. This is done by using trees, "borrowed" from maths just as the relational model borrowed set theory.
Figure 1: a hierarchical model
(Hierarchical-model diagram with nodes: Pavement Improvement; Reconstruction, Maintenance, Rehabilitation; Routine, Corrective, Preventive.)
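The 1:N parent/child structure of a hierarchical model can be sketched as Pascal type declarations; the names and the fixed child capacity are illustrative only:

```pascal
Type
  PNode = ^TNode;
  TNode = Record
    Name      : String [40];            { the record's attributes           }
    Parent    : PNode;                  { each child has exactly one parent }
    Children  : Array [1..10] Of PNode; { a parent may have many children   }
    NChildren : Integer;
  End;
```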
Network Model The network model is a database model conceived as a flexible way of representing objects and their relationships. Its distinguishing feature is that the schema, viewed as a graph in which object types are nodes and relationship types are arcs, is not restricted to being a hierarchy or lattice. The network model's original inventor was Charles Bachman. - wikipedia The popularity of the network data model coincided with the popularity of the hierarchical data model. Some data is more naturally modelled with more than one parent per child. So, the network model permitted the modeling of many-to-many relationships in data. In 1971, the Conference on Data Systems Languages (CODASYL) formally defined the network model. The basic data modelling construct in the network model is the set. A set consists of an owner record type, a set name, and a member record type. A member record type can have that role in more than one set, hence the multiparent concept is supported.
Page 45
Object-oriented programming has roots that can be traced back to the 1960s. As hardware and software became increasingly complex, manageability often became a concern. Researchers studied ways to maintain software quality and developed object-oriented programming in part to address common problems by strongly emphasizing discrete, reusable units of programming logic. The technology focuses on data rather than processes, with programs composed of self-sufficient modules ("classes"), each instance of which ("objects") contains all the information needed to manipulate its own data structure ("members"). This contrasts with the modular programming that had been dominant for many years, which focused on the function of a module rather than specifically on its data, but equally provided for code reuse and self-sufficient reusable units of programming logic, enabling collaboration through the use of linked modules (subroutines). This more conventional approach, which still persists, tends to consider data and behaviour separately.
A collection of interacting objects
An object-oriented program may thus be viewed as a collection of interacting objects, as opposed to the conventional model, in which a program is seen as a list of tasks (subroutines) to perform. In OOP, each object is capable of receiving messages, processing data, and sending messages to other objects. Each object can be viewed as an independent 'machine' with a distinct role or responsibility. The actions (or "methods") on these objects are closely associated with the object. For example, OOP data structures tend to 'carry their own operators around with them' (or at least "inherit" them from a similar object or class). In the conventional model, the data and operations on the data don't have a tight, formal association.

(Figure 4: example of an Object-Oriented Database Model, contrasting relational SQL data definition and manipulation syntax (create, insert and select) with object extensions.)

Object/relational database management systems add new object storage capabilities to the relational systems at the core of modern information systems. These new facilities integrate management of traditional field data, complex objects such as time-series and geospatial data, and diverse binary media such as audio, video, images, and applets. By encapsulating methods with data structures, an ORDBMS server can execute complex analytical and data manipulation operations to search and transform multimedia and other complex objects. As an evolutionary technology, the object/relational (OR) approach has inherited the robust transaction- and performance-management features of its relational ancestor and the flexibility of its object-oriented cousin. Database designers can work with familiar tabular structures and data definition languages (DDLs) while assimilating new object-management possibilities.
Query and procedural languages and call interfaces in ORDBMSs are familiar: SQL3, vendor procedural languages, and ODBC, JDBC, and proprietary call interfaces are all extensions of RDBMS languages and interfaces. And the leading vendors are, of course, quite well known: IBM, Informix, and Oracle.
The Object-Oriented Model
Object-oriented programming (OOP) is a programming paradigm that uses "objects" - data structures consisting of data fields and methods together with their interactions - to design applications and computer programs. Programming techniques may include features such as data abstraction, encapsulation, modularity, polymorphism, and inheritance. Many modern programming languages now support OOP. - wikipedia

An object is a discrete bundle of functions and procedures, often relating to a particular real-world concept such as a bank account holder or hockey player. Other pieces of software can access the object only by calling those of its functions and procedures that outsiders are allowed to call. A large number of software engineers agree that isolating objects in this way makes their software easier to manage and keep track of. However, a significant number of engineers feel the reverse may be true: that software becomes more complex to maintain and document, or even to engineer from the start.

Object DBMSs add database functionality to object programming languages. They bring much more than persistent storage of programming language objects. Object DBMSs extend the semantics of the C++, Smalltalk and Java object programming languages to provide full-featured database programming capability, while retaining native language compatibility. A major benefit of this approach is the unification of application and database development into a seamless data model and language environment. As a result, applications require less code, use more natural data modelling, and codebases are easier to maintain. Object developers can write complete database applications with a modest amount of additional effort. According to Rao (1994), "The object-oriented database (OODB) paradigm is the combination of object-oriented programming language (OOPL) systems and persistent systems.
The power of the OODB comes from the seamless treatment of both persistent data, as found in databases, and transient data, as found in executing programs." In contrast to a relational DBMS where a complex data structure must be flattened out to fit into tables or joined together from those tables to form the in-memory structure, object DBMSs have no performance overhead to store or retrieve a web or hierarchy of interrelated objects.
Page 46
Associative Model
The associative model of data is an alternative data model for database systems. Other data models, such as the relational model and the object data model, are record-based. These models involve encompassing attributes about a thing, such as a car, in a record structure. Such attributes might be registration, colour, make, model, etc. In the associative model, everything which has discrete independent existence is modelled as an entity, and relationships between them are modelled as associations. The granularity at which data is represented is similar to schemes presented by Chen (Entity-Relationship model); Bracchi, Paolini and Pelagatti (Binary Relations); and Senko (The Entity Set Model). A number of claims made about the model by Simon Williams, in his book The Associative Model of Data, distinguish the associative model from more traditional models. - wikipedia

The associative model divides the real-world things about which data is to be recorded into two sorts: Entities are things that have discrete, independent existence; an entity's existence does not depend on anything else. Associations are relationships whose existence depends on one or more other things, such that if any of those things ceases to exist, then the association itself ceases to exist or becomes meaningless.

An associative database comprises two data structures:
1. A set of items, each of which has a unique identifier, a name and a type.
2. A set of links, each of which has a unique identifier, together with the unique identifiers of three other things, that represent the source, verb and target of a fact that is recorded about the source in the database. Each of the three things identified by the source, verb and target may be either a link or an item.

(Figure 6: an overview of the history of SQL, from SQL to SQL 2, 1992.)

Semi-structured Model
The semi-structured model is a database model. In this model, there is no separation between the data and the schema, and the amount of structure used depends on the purpose. The advantages of this model are the following:
* It can represent the information of data sources which cannot be constrained by a schema.
* It provides a flexible format for data exchange between different types of databases.
* It can be helpful to view structured data as semi-structured (for browsing purposes).
* The schema can easily be changed.
* The data transfer format may be portable.
The primary trade-off made in using a semi-structured database model is that queries cannot be made as efficiently as in a more constrained structure, such as the relational model. Typically the records in a semi-structured database are stored with unique IDs that are referenced with pointers to their location on disk. This makes navigational or path-based queries quite efficient, but searches over many records (as is typical in SQL) are less efficient because the engine has to seek around the disk following pointers. The Object Exchange Model (OEM) is one standard for expressing semi-structured data; another is XML. - wikipedia

In the semi-structured data model, the information that is normally associated with a schema is contained within the data, which is sometimes called "self-describing". In such a database there is no clear separation between the data and the schema, and the degree to which it is structured depends on the application. In some forms of semi-structured data there is no separate schema; in others it exists but only places loose constraints on the data. Semi-structured data is naturally modelled in terms of graphs which contain labels which give semantics to its underlying structure. Such databases subsume the modelling power of recent extensions of flat relational databases: nested databases, which allow the nesting (or encapsulation) of entities, and object databases which, in addition, allow cyclic references between objects. Semi-structured data has recently emerged as an important topic of study for a variety of reasons. First, there are data sources such as the Web, which we would like to treat as databases but which cannot be constrained by a schema. Second, it may be desirable to have an extremely flexible format for data exchange between disparate databases. Third, even when dealing with structured data, it may be helpful to view it as semi-structured for the purposes of browsing.
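Returning for a moment to the associative model: its two data structures, items and links, might be declared in Pascal like this. A purely illustrative sketch; the names and string lengths are our assumptions, not taken from any actual product.

```pascal
Type
  TIdent = LongWord;
  TItem = Record            { a thing with discrete, independent existence }
    Id   : TIdent;
    Name : String [60];
    Kind : String [20];
  End;
  TLink = Record            { a source - verb - target triple              }
    Id     : TIdent;
    Source : TIdent;        { each may refer to an item or to another link }
    Verb   : TIdent;
    Target : TIdent;
  End;
```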
Page 47
All pointers that belong to a particular static pointer type point to the same Class (albeit, possibly, to different Object). In this case, the Class name is an integral part of the that pointer type. A dynamic pointer type describes pointers that may refer to different Classes. The Class, which may be linked through a pointer, can reside on the same or any other computer on the local area network. There is no hierarchy between Classes and the pointer can link to any Class, including its own. In contrast to pure object-oriented databases, context databases is not so coupled to the programming language and doesn't support methods directly. Instead, method invocation is partially supported through the concept of VIRTUAL fields. A VIRTUAL field is like a regular field: it can be read or written into. However, this field is not physically stored in the database, and in it does not have a type described in the scheme. A read operation on a virtual field is intercepted by the DBMS, which invokes a method associated with the field and the result produced by that method is returned. If no method is defined for the virtual field, the field will be blank. The METHODS is a subroutine written in C++ by an application programmer. Similarly, a write operation on a virtual field invokes an appropriate method, which can changes the value of the field. The current value of virtual fields is maintained by a run-time process; it is not preserved between sessions. In object-oriented terms, virtual fields represent just two public methods: reading and writing. Experience shows, however, that this is often enough in practical applications. From the DBMS point of view, virtual fields provide transparent interface to such methods via an aplication written by application programer. A context database that does not have composite or pointer fields and property is essentially RELATIONAL. With static composite and pointer fields, context database become OBJECT-ORIENTED. 
Something special: The Context Model
The context data model combines features of all the above models. It can be considered as a collection of object-oriented, network and semistructured models, or as a kind of object database. In other words, this is a flexible model: you can use any type of database structure, depending on the task. Such a data model has been implemented in the ConteXt DBMS.
The fundamental unit of information storage in ConteXt is a CLASS. A class contains METHODS and describes an OBJECT. The Object contains FIELDS and a PROPERTY. A field may be composite, in which case the field contains SubFields, and so on. The Property is a set of fields that belongs to a particular Object (similar to an AVL database). In other words, fields are the permanent part of an Object, whereas the Property is its variable part. The header of a class contains the definition of the internal structure of the Object, which includes the description of each field, such as its type, length, attributes and name. The ConteXt data model has a set of predefined types as well as user-defined types. The predefined types include not only character strings, texts and digits, but also pointers (references) and aggregate types (structures).
The context model comprises three main field types: REGULAR, VIRTUAL and REFERENCE. A regular (local) field can be ATOMIC or COMPOSITE. An atomic field has no inner structure. In contrast, a composite field may have a complex structure, and its type is described in the header of the Class. Composite fields are divided into STATIC and DYNAMIC. The type of a static composite field is stored in the header and is permanent. The description of the type of a dynamic composite field is stored within the Object and can vary from Object to Object. Like a NETWORK database, a context database has, apart from the fields directly containing the information, fields storing the place where this information can be found, i.e. a POINTER (link, reference), which can point to an Object in this or another Class. Because the main unit addressed in a context database is an Object, a pointer addresses an Object instead of a field of that Object. Pointers are either STATIC or DYNAMIC.
If a context database has only Properties, it is an ENTITY-ATTRIBUTE-VALUE database. With dynamic composite fields, a context database becomes what is now known as a SEMI-STRUCTURED database. If the database uses all the available types, it is a ConteXt database! For more information see: Concepts of the ConteXt database. The current version of the ConteXt DBMS can be downloaded from the UnixSpace Download Center:
http://www.unixspace.com/download/index.html
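To make the terminology above a little more concrete, here is a rough Pascal sketch of how a class header with regular (atomic/composite) and reference fields could be modelled. All type and field names here are our own illustration, not ConteXt's actual storage format:

```pascal
type
  TFieldKind = (fkRegular, fkVirtual, fkReference);

  TContextField = class
  public
    Name: string;
    Kind: TFieldKind;
    // A composite field owns subfields; an atomic field has none
    SubFields: array of TContextField;
    // For fkReference: the pointer addresses a whole Object in the
    // target class, never an individual field of that Object
    TargetClass: string;
  end;

  // The class header describes the permanent structure (the fields);
  // the Property part varies per Object and is therefore not part
  // of the header
  TContextClass = class
  public
    Name: string;
    Fields: array of TContextField;
  end;
```

The point of the sketch is the asymmetry the article describes: static structure lives in the class header, while dynamic composite fields and Properties travel with each individual Object.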
During this summer I interviewed many people about the subject of Object Oriented Databases. One of the most interesting and informative suppliers I found was InterSystems. It's a firm that operates worldwide, with an office here in Belgium and a subsidiary in Amsterdam. They have a great OODB called CACHÉ. They were very helpful and I had a long talk with one of their specialists. I actually learned quite a lot about OODBs, and the current real-world market for OODBs. Object Oriented Databases have found a very secure niche in database land. One of the things I learned was that InterSystems had added SQL to the OODB, so offering all normal database functionality. I was told about their special client: the Belgian Police. A tragic crime in 1996, and its aftermath, resulted in sweeping reforms to Belgium's police forces. New laws enacted in 1998 called for the consolidation of Belgium's 196 independent municipal police zones (using a patchwork of information systems), the federal police (with its own information systems), and the Criminal Investigation Department, into a single integrated force with two levels: federal and local police.
DATABASE SPECIAL 2010 BLAISE PASCAL MAGAZINE
Page 48
COMPONENTS
DEVELOPERS
A) Classes are modular. Programmers can improve the internal workings of classes without affecting the rest of the application at all. B) Classes are interoperable. Classes can be shared between applications, because the interface (the properties and methods) remains constant. Polymorphism refers to the fact that methods used in multiple classes can share a common interface, even if the underlying implementation is different. For example, say an application uses several different classes: Letter, Mailing Label, and ID Badge, all of which contain a method called PrintAddress. The application doesn't need to contain special instructions about formatting an address for each kind of object. It merely includes a command that says something like "DO PrintAddress(objectID)". Polymorphism ensures that each object carries out the instruction in a manner appropriate for the class to which that object belongs.
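The PrintAddress idea maps naturally onto Object Pascal's virtual methods. The sketch below is our own illustration of the principle, not code from Caché; the class and method names follow the example in the text:

```pascal
type
  TAddressItem = class
  public
    procedure PrintAddress; virtual; abstract;
  end;

  TLetter = class(TAddressItem)
  public
    procedure PrintAddress; override;
  end;

  TMailingLabel = class(TAddressItem)
  public
    procedure PrintAddress; override;
  end;

procedure TLetter.PrintAddress;
begin
  Writeln('Formal address block, as used on a letter');
end;

procedure TMailingLabel.PrintAddress;
begin
  Writeln('Compact address, as used on a mailing label');
end;

// The caller just says "print the address"; polymorphism ensures
// each object carries out the instruction in the manner appropriate
// for its own class.
procedure PrintAll(const Items: array of TAddressItem);
var
  I: Integer;
begin
  for I := 0 to High(Items) do
    Items[I].PrintAddress;
end;
```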
The Caché Object Model is like the Real World
Caché object technology attempts to describe the way that humans actually think about and use data. It does this by bundling together data and the code that controls how the data is used. In object parlance, the various pieces of data contained in a class are called "properties" and the sections of code that describe how the data behaves are called "methods". Caché object technology also promotes a naturalistic view of data by not restricting properties to simple, computer-centric data types. Classes may contain other classes, or references to other classes, which makes it easy to build useful and meaningful data models. Here's a simple example of multiple inheritance: a DOG class that descends from both a MAMMAL and a PET superclass.
Figure 10: Subclasses can inherit attributes from one or more superclasses
Figure 11: even though Customer contains a large amount of information (such as Name and SSN), an application can treat it as a single entity - an object.
Account Rep: Account Rep is a fairly complex object that exists independently from Customer. In this example, Customer includes a reference to the appropriate Account Rep.
Invoice: references can be made to more than one instance of a class, thus creating a collection. A collection can be thought of as a one-to-many relationship. Caché also supports other types of relationships.
Creating Objects
Classes are rapidly created and edited with the Caché Studio. The Studio is an integrated development environment (IDE) where developers can perform all of their application development tasks. For data modelling, this includes specifying properties, coding and debugging object methods, and defining specialized data types. The support for advanced object concepts - simple and multiple inheritance, embedded objects, references to objects, collections, relationships, and polymorphism - makes the Studio a powerful and productive environment for modelling data and business processes.
Importing/Exporting Data Models
The Studio includes a wizard for the easy creation of Caché classes, but there are several other ways to input and export class definitions to and from the Studio. The Caché RoseLink allows classes defined using Rational Software's popular Rose object modelling tool to be imported into Caché. Similarly, class definitions can be exported to Rose for use within Rose's modelling environment. Caché can also create objects from relational DDL files. The resulting object classes will be very simple: their properties will be single-valued system-defined data types that correspond to the relational table's fields, and their only methods will be those persistence methods required to move the data to and from the disk. However, thanks to Caché's Unified Data Architecture, even these simple classes are immediately available for use with object programming languages, and they may be used as building blocks to create more complex data models. XML provides another way to transport class definitions from one application to another: class definitions can be exported and imported as XML documents.
By virtue of Caché's Unified Data Architecture, all its classes are automatically accessible as relational tables via ODBC and JDBC. And by taking advantage of inheritance, Caché classes can easily be adapted for use with XML and object-oriented technologies.
Caché Server Pages
A class designated as a Caché Server Page automatically inherits all the necessary Web session management methods, plus the "OnPage()" method where developers can code the page content.
XML
Inheriting properties and methods from the %XML.Adaptor class (provided by InterSystems) enables a class to import/export XML data. Caché will automatically determine the mapping between Caché objects and XML documents, or developers may create custom maps.
COM
A single command within the Caché Studio projects Caché classes as COM classes for use with tools such as Visual Basic, Delphi, and any software compatible with the COM interface. Caché also includes a COM Gateway, which allows COM objects to be used by Caché applications.
Java
One command can project Caché classes as Java classes. Caché also provides a class library that allows Java programmers to access Caché objects in the Caché database.
EJBs
EJB projections can also be created with one click from within Caché Studio. Caché allows developers to take advantage of the speed of Bean-Managed Persistence, without having to do lots of tedious coding to map between Java classes and relational tables. Caché supports BEA's WebLogic application server.
Scripting Languages
Methods in Caché objects are coded using either (or both of) Caché ObjectScript and Caché Basic. Both languages allow developers to use all of Caché's data access modes - Objects, SQL, and Multidimensional - within the same routine.
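As an aside, consuming a Caché class projected as a COM class from Delphi might look like the hedged sketch below. The ProgID 'EventReg.Customer' and its members are invented for illustration; the real names come from your own projected classes:

```pascal
uses
  ComObj, Variants;

procedure ShowCustomerName;
var
  Customer: OleVariant;
begin
  // Late-bound access to a hypothetical Caché class that has been
  // projected as a COM object by Caché Studio
  Customer := CreateOleObject('EventReg.Customer');
  Customer.Name := 'J. Jansen';
  Writeln(Customer.Name);
end;
```

Late binding via OleVariant keeps the example short; in practice you would more likely import a type library and use early-bound interfaces.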
If data has to be presented in a clear manner, people frequently use a report. A report presents the data in a predefined format. This can be done using a table, a chart, a graphic or plain text. The data can come from several sources, such as a database, the result of a (complex) calculation, data which have been typed earlier by a user and stored in a file or even the output of measuring equipment.
It is possible to program the report layout entirely from code. The programmer calculates and codes the exact position of each report element. This is however not a flexible solution. Imagine an end-user wants an extra graphic or table added somewhere in the report, or even the same data in another order. Then the programmer has to recalculate and recode the entire report layout, check and discuss this new layout with the user, and so on. It would be much easier if the end-user could make (small) changes to the report layout himself, even without having any programming skills. A solution that provides this kind of flexibility is to use a report generator. A report generator generates a report from the report blueprint and the data. The resulting report can be viewed, printed or even exported to a format you need (figure 1).
Figure 1: Internals of a report generator
There are a couple of commercially available report generators for Delphi. The most well-known are: Report Builder, Crystal Reports, Rave Reports, Quick Report and FastReport. We ourselves were searching for a report generator to be used in a couple of existing and future applications. These applications are (or will be) written in Delphi 7, 2006 and C#. They use the same databases (MS-SQL) and several other data sources (XML, CSV and EDF/EDF+). A couple of requirements we had for choosing a report generator were:
* Suitable for both Delphi and .NET
* Integrated end-user layout editor
* Export to different formats (e.g. PDF, JPG, TIFF) with preview
* Source available, to adapt/extend functionality and to be independent of the supplier
* Possibility to have web-enabled reports in the future
* Not too expensive (well, that's business as usual in healthcare)
With these requirements in mind we have looked at the report generators and listed the strong and weak points of each:
Report Builder:
+ Fully integrated with the Delphi IDE
+ Visually create a report layout
+ 21 different components to present the data
+ Speed
+ Runtime Pascal Environment (RAP): Object Pascal with event handling integrated, to create complex reports
+ Good documentation
+ Source code is included in all editions
- RAP is only available with the Enterprise and Server editions
- Relatively expensive. The version with the end-user layout editor integrated (Professional edition) costs $495.
- No .NET version available
Crystal Reports:
+ Has a lot of features
+ Can display data from many different databases
- Integration with Delphi can be a problem
- Separate licenses needed for commercial applications
- Expensive: license fees start at 479,-
Rave Reports:
+ Free. Rave BE (Bundled Edition) is installed automatically when installing Delphi 7, 8, 2005, 2006, 2007, 2009, 2010 and XE.
+ Fully integrated with the Delphi IDE
- People say that there are a lot of problems with the Delphi 2009 edition.
- Very hard to make contact or get support.
- Website hasn't been updated for a long time and there is no information about the versions they sell.
- No .NET version available
Quick Report:
+ Quick Report (version 5.05) is available for Delphi 5/6/7/2005/2006/2007/2009/2010/XE (Win32 mode). Version 5.05 for Delphi XE 32 is now available for download; they are working on the C++ Builder XE version.
+ Fully integrated with the Delphi IDE
+/- Pro version 345,-. Upgrade price is 25% of a new license.
+ End-user report editor and PDF export (Pro version)
- It seems that Quick Report contains a couple of nasty bugs
- No .NET version
FastReport:
+ Fully integrated with the Delphi IDE
+ Improved engine: improved shift mechanism / duplicate combining / new aggregates / improved cross object / changes in XML format (write collections in XML) / improved report inheritance / hierarchy / watermarks / object fills / improved linear barcodes / improved interactive reports (OnMouseEnter/OnMouseLeave events) / detailed reports / multi-tab preview for detailed reports
+ Well documented
+ Create a layout visually from the Delphi IDE or Visual Studio
+ A lot of different components to present the data. New objects: new 2D barcodes (DataMatrix and PDF417), Table object, Cellular text, Zip Code
+ A lot of export filters: PDF, RTF, XLS, XML, HTML, JPG, BMP, GIF, TIFF, TXT, CSV, Open Document Format (you can even build your own export filters!). New exports: BIFF XLS / PPTX / XLSX / DOCX
+ Built-in script engine for PascalScript, C++Script, BasicScript and JScript, with debugger (Win32 version). The .NET version currently uses C# and VB.NET for scripting.
+ Versions available for Delphi 4 - XE and .NET (integrated with Visual Studio: Delphi Prism, C#, VB.NET, etc.)
+ Built-in end-user report editor, starting at the Standard edition, without any extra license fees
+ Source available (starting at the Professional edition)
+ Web reports (Enterprise edition)
+ Licenses from $79 (Basic) to $349 (Enterprise)
Figure 3: Report editor pages
As a small exercise we will display data from a demo database. To create a report of data from a database we have to add the database to the Data page. We can do that by dragging the database component onto the Data page and double-clicking its icon. This will start the Connection Wizard. You can also connect to a database manually by setting the component's properties. For this exercise we will use a BDETable component and set the DatabaseName property to DBDEMOS (a Delphi demo database) and the TableName property to clients.dbf. The database is now ready for use.
Listing 1: Adding report variables from code
You can add a 'Master' band without a database to your report by adding the 'Master' band and, when the 'Select DataSet' dialog appears, selecting '[unassigned]'. Do not forget to set the number of records to 1. If the number of records is set to 0, FastReport will not display the band, in effect hiding it. The way data is presented in the report, for example 2 decimals, or dd-MMM-yyyy as a date format, can be set by double-clicking the 'Text Object', selecting the 'Format' tab and setting the desired display format.
Creating a dynamic report with code
You can not only show static data from databases and variables. It is also possible to add pieces of code to your report. You can for instance add code to calculate a new value from the other report variables, e.g. convert a number of seconds to a string containing the equivalent hours and minutes. FastReport VCL uses Pascal (script) code. FastReport .NET uses C# or VB.NET (the current .NET version 1.1 can only use these two languages, even when you are using Delphi Prism!). As you can imagine, you can create very sophisticated reports using script code. Every component you add to the report can have event handlers attached to it. For instance, there is an 'OnBeforePrint' event which will be called just before the component is rendered to the internal report canvas. You can use these event handlers to do some special processing. For example: you want to hide a 'Memo' component if a report variable has a predefined value. In the 'OnBeforePrint' handler you can check this value and optionally hide the Memo, by setting its 'Visible' property to False.
Adding your own functions
With FastReport you can report simple variables easily, but it is also possible to create complex expressions using one or more of the available mathematical, string and other functions. Even if the function you require is not available, you can create and implement your own custom function.
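Hiding a memo from an 'OnBeforePrint' handler could look roughly like the FastReport PascalScript sketch below, written in the report designer. The component name MemoNotes and the report variable ShowNotes are hypothetical names for this illustration:

```pascal
// FastReport VCL script (PascalScript), attached to the memo's
// OnBeforePrint event in the report designer
procedure MemoNotesOnBeforePrint(Sender: TfrxComponent);
begin
  // Hide the memo when the (hypothetical) report variable says so
  if <ShowNotes> = 0 then
    MemoNotes.Visible := False;
end;
```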
To use a custom function you have to implement the algorithm using Delphi and register it with FastReport. We ourselves have variables that are expressed in seconds, but the report user asked to show the values as hours and minutes. Creating the function in Delphi was rather straightforward, as was registering it with FastReport (listing 2).
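The original listing 2 is not reproduced here; in FastReport VCL the registration typically combines TfrxReport.AddFunction with the report's OnUserFunction event, roughly as sketched below. The function name SecToHHMM and the category 'My functions' are our own naming:

```pascal
// Register the custom function so it appears on the 'Functions'
// tab of the report designer
procedure TForm1.FormCreate(Sender: TObject);
begin
  frxReport1.AddFunction(
    'function SecToHHMM(ASeconds: Integer): String',
    'My functions',
    'Converts a number of seconds to a hh:mm string');
end;

// Called by FastReport whenever a report script uses a registered
// user function
function TForm1.frxReport1UserFunction(const MethodName: string;
  var Params: Variant): Variant;
var
  Seconds: Integer;
begin
  if SameText(MethodName, 'SECTOHHMM') then
  begin
    Seconds := Params[0];
    Result := Format('%.2d:%.2d',
      [Seconds div 3600, (Seconds mod 3600) div 60]);
  end;
end;
```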
procedure TForm1.Button1Click(Sender: TObject);
begin
  frxReport1.DesignReport;
end;
Listing 3: Activating the designer from code
Figure 5: The newly added function
You can find your newly added function on the 'Functions' tab. You can now use this function by dragging the variable you want to be shown as hours and minutes onto your report. Double-click the component (a property editor will pop up) and modify the variable name into the following expression: [SecToHHMM(<VariabeleNaam>)]. This procedure of adding your own functions is only available in FastReport VCL. FastReport .NET uses a different, even simpler approach. First create an assembly with the functions you will need and add this assembly to the 'Assembly' property of the report. You will be able to use your own functions right away. You can also use the functions already defined by the assemblies of the .NET framework.
Figure 6: Inheritance
Report inheritance
A lot of companies have a default report layout with their name, address, bank account, logo, etc. Adding these same elements over and over again when you create a new report layout is a boring and error-prone task. The creators of FastReport have found a solution for this problem: 'report inheritance'. You create the base report with the company info once and use this report as a base for all 'descendant' reports. You can set the base report in the 'Report Settings' dialog (figure 6). If you have to change, for instance, the address or bank account of the company, you only have to change the base report. All descendant reports will then be changed automatically. You can use all elements of the base report from your descendant report and change properties (e.g. font, size, color) without changing the base report. There are however some restrictions on report inheritance:
- The base report cannot contain any code. FastReport will not warn you, but the inherited report will disregard all code from the base report.
- You cannot inherit from an inherited report.
- You cannot use the same component names in both the base and inherited reports. So if you don't give your components unique names, it is possible that when you add new components to your base report, the inherited report will end up with components with the same names and you won't be able to use your inherited report.
Creating or modifying a report layout from code
It is also possible to add, modify or remove report elements, or change their properties, from inside your application. Listing 4 shows (a part of) the code to add a 'Page Footer' to a report. This way of modifying components is equivalent to the way it can be done for a Delphi form.
procedure TPolymanAnalysisReport.AddDefaultPageFooter(APage: TfrxReportPage);
var
  Memo: TfrxMemoView;
  PageFooter: TfrxPageFooter;
  S: String;
begin
  PageFooter := TfrxPageFooter.Create(APage);
  PageFooter.Name := 'ProgramAndReportInformation';
  Memo := TfrxMemoView.Create(PageFooter);
  Memo.Name := 'MemoProgramAndReportInfo';
  Memo.Font.Size := 8;
  Memo.StretchMode := smActualHeight;
  { the original listing continues by composing S with program and
    report information and assigning it to the memo }
end;
namespace FastReport
{
  public class ReportScript
  {
    public string IIF(bool condition, string trueValue, string falseValue)
    {
      return condition ? trueValue : falseValue;
    }
  }
}
An overview of reports
FastReport .NET and FastReport VCL
The creators of FastReport have succeeded in creating a report generator that can be used in both Win32 and .NET applications. The VCL and .NET versions have a lot in common: the same kind of editor, the same kind of components, etc. But there are also a lot of differences. These differences are logical: the two frameworks on which the report generators are based, the VCL and the .NET framework, are also quite different. The .NET version has had a complete makeover: new classes and a new, modern editor look. We have used the VCL version for some time now, but we have to read the manual now and then to find out how things work in the .NET version.
Lazarus
LazReport is not compatible with FastReport yet, because LazReport is based on FreeReport (a very old version: FastReport 2.3). For example, the current FastReport file format is XML, but the second version used a binary format. In contact with Michael Philippenko from FastReport, he told us that as soon as the special trial component for all purposes has been developed (the Lazarus team and Blaise Pascal Magazine are working on that), they will consider building a version for Lazarus.
Conclusion
This article shows some of the possibilities of FastReport, not all. We will build example applications and use them for publication in the next issue. Up till now we haven't bumped into problems we could not solve; sometimes we had to consult the manual or search the support forum to find the solution. We strongly recommend FastReport for its great quality, its relatively low price, its enormous number of features and its expandability.
Figure 1. Note the ID field, which is the primary key of type autoinc.
This table is saved as Registration.adt, and can now be used by the application we'll make with Delphi Prism. But before we continue, we must make sure that apart from the Advantage Database Server, we've also installed the Advantage .NET Data Provider (so we can use ADO.NET and ASP.NET with declarative data binding to connect to the Advantage database).
Delphi Prism XE
Delphi Prism XE is the most recent edition of Delphi Prism at the time of writing (people with a subscription received no less than two major updates in the last year: first from Delphi Prism 2010 to Delphi Prism 2011, and then a few weeks ago Delphi Prism XE). It can run in both Visual Studio 2008 and 2010, but for this article I'm using Delphi Prism XE in Visual Studio 2010, together with ADS v10 as mentioned before. Using Delphi Prism XE, we can create ASP.NET Web Projects, with File | New Project, using the dialog from the following screenshot:
Figure 2. For the purpose of this demo, I'll give the project the name EventRegistration. The ASP.NET project will consist of one page, Default.aspx, where we should start by placing a FormView control from the Data category of the Toolbox. The FormView has a number of tasks, including one to Choose the Data Source. Since there is no Data Source on the page yet, we should select the <New data source> option instead:
Figure 3. This will produce the Visual Studio Data Source Configuration Wizard, where we can specify where the application will get its data from. In our case, that's from a SQL Database, so click on the SQL Database item. This will automatically generate a default ID for the data source (SqlDataSource1) and place this ID in the textbox so we can modify it if needed. Click on OK to go to the next page of the wizard. In the second page, we can choose the data connection: either from a list of existing connections, or by clicking on the New Connection button. If you click on the New Connection button, a dialog will pop up in which we can choose the data source. Here, we can select the type of data source from a list that contains Advantage Database Server (if you've installed the Advantage .NET Data Provider), as well as for example DataSnap, InterBase, and several Microsoft drivers.
Figure 4.
Figure 5. If you click on the Continue button, a new dialog follows where we can specify the details needed to connect to the ADS Data Dictionary. Unless you've specified a username and password to access the Data Dictionary, this usually only means that we have to specify the location of the .add file.
Figure 6. Click on Test Connection to ensure that we can connect to the Data Dictionary. Click on OK if everything works, and back in the Configure Data Source wizard, we can click on OK to get to the next page, where the option is offered to save the connection in the web.config file. This is handy, since it means we can modify the connection string without having to recompile the application.
Figure 7. The next page allows us to build the way we want to retrieve the data from the database. We can use a SQL statement or stored procedure, or use the dialog to select a table (there is only one: Registration) and specify the fields that we want to use in a SELECT statement for example. Note that by default the wizard will check the * for all fields, but I prefer to explicitly check the fields I need instead.
Figure 8.
Figure 9. This will ensure that we can use the FormView in INSERT mode to enter new registrations. After we close the dialog, we can click on OK to get to the last page of the wizard, and close that one as well. The result is that we now have a FormView with a newly configured SqlDataSource component that connects to the Registrations table from the Event Data Dictionary. ASP.NET Page We can now configure the ASP.NET page, and especially the FormView to show itself in INSERT mode only. This can be done by selecting the FormView, and in the Properties Inspector making sure DefaultMode is set to Insert, with the following result:
Figure 10. Now we can give the application a test run, by selecting Debug | Start Without Debugging, which will start the ASP.NET Development Server as well as the default browser, showing the registration application in action:
Figure 11. Obviously, there are some issues with this page. First of all, the ID field is of type autoinc, so it shouldn't be part of the input screen. And second, after we click on the Insert hyperlink, the page doesn't jump to a "Thank you!" page (something we haven't implemented yet), but gives an error message instead:
Figure 12. This problem is caused by the fact that the declarative data binding in the generated .aspx file is using positional parameters in the INSERT statements, but named parameters in the list of parameters that follows it. In detail, the InsertCommand is specified as follows:
InsertCommand = "INSERT INTO [Registration] ([FirstName], [LastName], [Address], [Postcode], [City], [Country], [Company], [Email], [Phone], [ADS], [ID]) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)"
We should change the ? in the INSERT statement to :fieldname items, turning it into the following (also removing the ID field):
InsertCommand = "INSERT INTO [Registration] ([FirstName], [LastName], [Address], [Postcode], [City], [Country], [Company], [Email], [Phone], [ADS]) VALUES (:FirstName, :LastName, :Address, :Postcode, :City, :Country, :Company, :Email, :Phone, :ADS)"
We can implement this and catch INSERT errors in the Inserted event of the SqlDataSource. In the designer, select the SqlDataSource and then double-click on the Inserted event. Here, we can check if there were any INSERT errors (I leave it up to the reader to display them in a user-friendly way), or display the "Thank you!" message using a call to Response.Write. And that's it. Of course, the final registration page will need to look much better when deployed on the internet (including the use of validators to ensure certain fields are required), but the skeleton is in place. If you want to know more about the actual ADS Training Day in The Netherlands we want to use it for, and attend the session on November 3rd where I'll build this application 'live' (including the validators), then check out http://www.eBob42.com/ADS. Thanks in advance! Bob Swart
method _Default.SqlDataSource1_Inserted(sender: Object; e: SqlDataSourceStatusEventArgs);
begin
  if Assigned(e.Exception) then
  begin
    // handle error?
  end
  else
  begin
    Response.Write('<h1>Thank you for your registration!</h1>');
  end;
end;
What is a datawarehouse? It's a large collection of data from various sources, organized in a way that makes it possible to search efficiently in order to extract the required data. In other words, it requires data to be organized in a way that makes it easy to access and search. True datawarehouse centres typically specify the minimum requirements data must meet before it is acceptable to their system. The data must follow certain rules to be easily searchable, and that means in some situations that data must be reorganized to meet this requirement before it can be inserted into the datawarehouse. However, situations exist where data, for various reasons, can't be reorganized and must be offered to a datawarehouse as is; for example, where the data is used as evidence in a criminal case, or where the sheer volume of data in each package makes it impractical to reorganise. This article describes how such a scenario has been handled using kbmMW, allowing users to search many terabytes of data (the total volume of data is constantly increasing) stored in thousands of separate databases (also constantly increasing). The situation is that a company receives large (gigabyte-sized) databases in SQLite format every day from various places. The data, amongst other things, contains hundreds of thousands of records with detailed information about how a technical instrument is operating, and in addition contains a lot of frame-by-frame video footage too. Each technical device may (and typically will) produce many databases over time, each containing the results of a single production run. The company requires that no changes be made to the structure of their database files. And they are not interested in duplicating information into a different database system, for various reasons, one of them being security.
Another feature of kbmMW Enterprise Edition is its ability to communicate completely in async mode via its transport framework called WIB (Wide Information Bus). When operating with the WIB, all clients and servers are just nodes. The servers are nodes which publish certain information and subscribe to requests, while the clients typically are nodes publishing requests and subscribing to responses. kbmMW v3.50.01 Enterprise Edition, which is currently in beta 2 and is about to be released any day now, contains features that allow a server to provide incremental responses to a client's request for data from a database. It's not a simple fetch-on-demand scenario we are talking about, where the client asks for more and more information. It is a server-side push of data records (virtual or real) that are relevant for the ongoing search, whenever data is available. Further, we also need some sort of connection management for accessing the thousands of databases, preferably one where we can cache the connections to each database for later reuse. Hence we would like to end up with a client that simply makes a request for data, for example asking for temperature data for all productions from a specific technical instrument. That data can then be analyzed in various ways afterwards, which is beyond the scope of this article. (In the specific company case, a scripted, vector-oriented integrated development environment was developed for them, making it easy to analyze massive amounts of data using methods similar to Matlab's, and under full control play synchronized video from the frames in the databases.) The rest of this article shows how to build such an application server and client using kbmMW.
Previously, they were used to picking one database file at a time, analysing it remotely over the net using a tool such as Matlab. However, the performance of such operations was terrible, due to the enormous amount of data passing over the net. That led people to try duplicating data into other local databases that were easier to handle, but that wasn't advisable for several reasons:
- Security: confidential raw data was no longer contained as they would like it to be.
- Security: they allowed experts to update some sections of the database. However, if duplicate versions of the data existed, there was no guarantee that the updates actually altered the original database. Manual updating proved unreliable here.
- Performance: manually (or semi-automatically) copying data from many database files to local storage was time-consuming, space-consuming and error-prone.
Thus a better solution was sought. And that's where kbmMW really shines. As is probably well known, kbmMW is a middleware which provides the glue between a server application (or in kbmMW terms, an application server) and one or more clients. It allows the clients to request various operations on the application server, and expect responses to those requests. One of the many features provided by kbmMW is the ability to act as a middleman between a database and a client. By putting kbmMW in between the client and the database, you gain several advantages, including:
- Significantly reducing the network traffic required to access the database.
- Significantly increasing the security level around the database.
- Significantly increasing the responsiveness of the client.
The application server
We will first build a simple standard kbmMW application server with a built-in query service for accessing databases. Afterwards we will modify it bit by bit to give us the functionality we need as described above. We start out with a new empty VCL forms application, which means the application server will run as a regular application. In real-life scenarios you would create an application server as a service, which is illustrated in one of the many samples available in the kbmMW/Downloads/Samples section of the Components4Developers home page. Since creating Windows service applications is beyond the scope of this article, I just take a shortcut and create a simple Windows application that operates as a server. We add the following:
- A TkbmMWServer component. Name it Server.
- A TkbmMWTCPIPIndyMessagingServerTransport component; name it Transport. Set its property Server to point to Server, and its ClusterID to 'Demo'.
- Two message queue components for the inbound and outbound messages for the transport. Name one of them qIn and the other qOut. Set the property Transport.InboundMessageQueue to point at qIn, and Transport.OutboundMessageQueue to point at qOut.
- We also add two buttons, Listen and Don't listen, to activate and deactivate the server.
The form should look something like this:
Figure 1:
Page 66
COMPONENTS
DEVELOPERS
With messaging transports, correct subscriptions are crucial; otherwise the node (in this case the application server) will not receive the correct messages. As we will have a messaging-based client later on, we need to tell the server that it should accept three types of messages: Requests (REQ), Subscriptions (SUB) and Unsubscriptions (USB). The latter are not really required in this demo, but it's good practice to pair up SUB and USB subscriptions. In all cases the application server subscribes only for requests etc. for the cluster called Demo (set in the Transport's ClusterID property). Then let's tell the Transport that it should allow connections from anywhere. Open its Bindings property and at designtime add a new binding (Ip4).
Figure 2: Set the port number to 3000 and the mask to 0.0.0.0.
Figure 3: Then we add the query service via the service wizard:
Figure 4: Locate the Components4Developers Wizards and click the kbmMW Service Wizard:
Figure 5: The first page of the wizard is shown. Select the Query service/kbmMW_1.0 service type and click Next.
Figure 6: Now we are given the option to choose what type of database we would like to access. In this example we choose SQLite, which is supported by all kbmMW Editions. Then click Next.
Figure 7: We are then given the opportunity to name the service, and optionally give it a version. Versioning a service can be smart if, at some point, the service interface changes and we want to support both older and newer clients. Let's name the service DATAWAREHOUSE and keep the version empty. Then click Next.
Figure 8: Now click through all the remaining wizard pages, and click the OK button on the last page.
Figure 9: This generates a new data module for us, with a couple of components on it. The data module will be used by clients making database requests to the application server. Because we want to access a large number of SQLite databases, we have not defined a SQLite connection pool on the application server's main form (we deselected that option in the wizard), although in most applications that would be an expected step. Instead we will keep our own list of known SQLite databases, from which we will select one or more as needed. The next step is to register this service with the application server (the Server component) on the main form. The main form's OnCreate event is a suitable place to do that, as it only needs to be registered once. Figure 10:
Then we make a couple of methods to fetch database connection pools from our hash list (our pool of connection pools), and automatically create new connection pools when new databases are accessed. We first add the unit kbmMWSQLite to the uses clause of the unit Unit1.pas. Then we add this method:
function TForm1.GetConnectionPool(ADatabaseName: string): TkbmMWSQLiteConnectionPool;
begin
  DBs.BeginWrite;
  try
    Result:=TkbmMWSQLiteConnectionPool(DBs.GetObject(ADatabaseName));
    if Result=nil then
    begin
      Result:=TkbmMWSQLiteConnectionPool.Create(nil);
      try
        Result.Database:='yourdbdirectory\'+ADatabaseName+'.db';
        Result.Active:=true;
        DBs.AddManagedObject(ADatabaseName,Result);
      except
        FreeAndNil(Result);
        raise;
      end;
    end;
  finally
    DBs.EndWrite;
  end;
end;
For simplicity in this example, we design the application server to know about databases with filenames DB1.db, DB2.db, DB3.db.. DBn.db. The client will tell us what database number(s) we are to search in, at the appropriate time. This sample assumes that there is an identically structured table called 'DATA' in each database. Each database file needs its own TkbmMWSQLiteConnectionPool, to take care of connections to the database, caching of resultsets and metadata and more. Thus when the client requires access to a specific database, we can choose to create a TkbmMWSQLiteConnectionPool on-the-fly, connect it to the relevant database and execute the client query request. Or we can choose a better performing method, where we keep a list of all previously accessed databases, and thus keep the databases open and ready for use. A pool of connection pools. We'll show a simplistic way to do that now. We define a thread safe hash table that will contain all the previously accessed database's connection pools. The unit kbmMWGlobal contains lots of nice containers and other goodies, so we add it to the uses clause.
uses
  Windows, Messages, SysUtils, Variants, Classes, Graphics, Controls, Forms,
  Dialogs, kbmMWCustomMessagingTransport, kbmMWCustomTransport, kbmMWServer,
  kbmMWCustomServerMessagingTransport, StdCtrls, kbmMWGlobal;
The method first ensures that all access to the contents of DBs is protected, so only one thread at a time can access it. Then we look up a connection pool based on the database name. If none is found, we create a new one and add it 'managed' to the DBs storage. If the database the client is requesting doesn't exist at all, this method will throw an exception. Other ways to indicate the issue to the client could also be coded. Then we define a so-called virtual dataset on the query service. Right now we have a TkbmMWSQLiteQuery component there, which will be used for the query against a specific database. However, we would like to interact with the client in such a way that we don't just send the complete result (containing the combined matching records for all client-specified databases) in one go. Instead we would like to send incremental resultsets. In this sample we choose to send all matching records from one database to the client as one incremental resultset. On the query service data module (unit2.pas) we put a TkbmMWMDQuery and a TkbmMWMDConnectionPool.
Then we define a field in the Tform1 class, which is to hold the connection pools for the databases we have accessed and thus allow us to reuse the connection pools later on, without having to reopen the database files:
    procedure Button1Click(Sender: TObject);
    procedure Button2Click(Sender: TObject);
    procedure FormCreate(Sender: TObject);
  private
    { Private declarations }
  public
    { Public declarations }
    DBs: TkbmMWThreadHashStringList;
  end;
Figure 11: We rename kbmMWMDQuery1 to 'DATA', set its Published property to true and point its connection pool at the kbmMWMDConnectionPool1 component. That way clients can request data from this particular component. The virtual memory dataset (hence the MD acronym) provides some interesting events for us, namely the OnPerformFieldDefs and OnPerformQuery events. OnPerformFieldDefs allows us to easily define field definitions at runtime, on-the-fly. We could also define them at designtime for this demo, because we know the structure of the SQLite table DATA doesn't change across the different databases served by this application server. However, we'll define the definitions in code. It's also possible to define parameters in the same method, but since we need a way for the client query to provide information about the database name and some search criteria, we will define parameters for that at designtime.
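The article does not reproduce the field-definition code itself. An OnPerformFieldDefs handler could look roughly like the following sketch; the field names and types are assumptions, and must match the structure of the DATA table in your SQLite databases:

```pascal
procedure TkbmMWQueryService2.DATAPerformFieldDefs(Sender: TObject);
var
  ds: TkbmMWMDQuery;
begin
  ds := TkbmMWMDQuery(Sender);
  // Define the structure of the virtual dataset at runtime.
  // These names/types are illustrative only; use the actual
  // structure of the DATA table.
  ds.FieldDefs.Clear;
  ds.FieldDefs.Add('ID', ftInteger);
  ds.FieldDefs.Add('TIMESTAMP', ftDateTime);
  ds.FieldDefs.Add('VALUE', ftFloat);
end;
```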
We add the code required to create the instance upon form creation and destroy it at form destruction. We specify that objects that are 'managed' by the hash list are also deleted automatically by the hash list when entries are deleted from the list or the list itself is freed.
procedure TForm1.FormCreate(Sender: TObject);
begin
  DBs:=TkbmMWThreadHashStringList.Create(100);
  DBs.FreeObjectsOnDestroy:=true;
  Server.RegisterService(TkbmMWQueryService2,false);
end;

procedure TForm1.FormDestroy(Sender: TObject);
begin
  FreeAndNil(DBs);
end;
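The handlers for the Listen and Don't listen buttons are not shown in the article. Activating and deactivating a kbmMW server is typically a matter of toggling the server component's Active property; a minimal sketch, assuming the default listen behaviour:

```pascal
procedure TForm1.Button1Click(Sender: TObject); // Listen
begin
  // Start accepting messages on the registered transports.
  Server.Active:=true;
end;

procedure TForm1.Button2Click(Sender: TObject); // Don't listen
begin
  // Stop accepting messages.
  Server.Active:=false;
end;
```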
Then we add the parameters at designtime via the property Params of the DATA (TkbmMWMDQuery) component.
Figure 14: Finally we need to add some code to do the actual search and return the incremental data. This code should be put in the OnPerformQuery event. The first part of the PerformQuery event extracts the parameter values provided by the client.
procedure TkbmMWQueryService2.DATAPerformQuery(Sender: TObject;
  var ACanCache, ACallerMustFree: Boolean; var ADataset: TDataSet);
var
  i: integer;
  sDBID: string;
  ds: TkbmMWMDQuery;
  sDatabase: string;
  slDatabase: TStringList;
  conditionLow, conditionHigh: double;
  cp: TkbmMWSQLiteConnectionPool;
  bFirst: boolean;
  mt: TkbmMemTable;
begin
  ds:=TkbmMWMDQuery(Sender);

  // Get parameter values.
  sDatabase:=ds.ParamByName['Database'].AsString;
  conditionLow:=ds.ParamByName['conditionLow'].AsFloat;
  conditionHigh:=ds.ParamByName['conditionHigh'].AsFloat;
Figure 12: Three parameters have been created:
1 Database, ftString, ptInput, Size 100
2 ConditionLow, ftFloat, ptInput
3 ConditionHigh, ftFloat, ptInput
The purpose of these parameters is to allow the client to indicate the database number, plus some conditions we may choose to use for the selection from a database. By convention we define that Database may contain one or more comma-separated numbers indicating the databases for which a result is requested. Then we define the SQL statement which will query a single database. That can be done at runtime or at designtime. Since the SQL is the same regardless of which database is queried, we define it at designtime.
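The SQL statement itself is not reproduced in the article. Given the DATA table convention and the two float parameters, it might plausibly be something like the following; the column name Value is an assumption about the table's structure, and the statement would normally be entered at designtime in the SQL property:

```pascal
// Entered at designtime on the SQL property of kbmMWSQLiteQuery1;
// shown here as code for clarity. The column name Value is assumed.
kbmMWSQLiteQuery1.SQL.Text :=
  'select * from DATA '+
  'where Value >= :ConditionLow and Value <= :ConditionHigh';
```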
Then comes a loop that runs for each database the client has asked the application server to query. For each database, a connection pool is requested and a native TkbmMWSQLite query is executed. The resulting dataset (if any) is then sent asynchronously to the client via the SendPartialResultDataset method. The method needs to know whether it is the first partial resultset or an intermediate one. The last partial resultset MUST be sent by returning from the event handler with a dataset that it can send to the client. That final dataset will be marked as either the complete dataset (if SendPartialResultDataset was never called) or the final partial dataset (if SendPartialResultDataset has been called at least once). kbmMW keeps track of that internally. The purpose of this is twofold: 1) The server doesn't need to send complete field definitions for intermediate or final partial datasets. 2) The client will know if further partial datasets are to come, or if all have been received.

Figure 13: The parameters must then be configured in the Params property. We have 2 parameters to define: conditionLow and conditionHigh. They need to be of DataType ftFloat and ParamType ptInput.
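The middle of the event handler, the loop itself, is not reproduced in the article (the final code fragment later in the article shows its closing lines). Based on the description above it could look roughly like this; the SendPartialResultDataset call form, the error-code check and the reference to Form1.GetConnectionPool are assumptions:

```pascal
  bFirst:=true;
  slDatabase:=TStringList.Create;
  try
    slDatabase.CommaText:=sDatabase;
    for i:=0 to slDatabase.Count-1 do
    begin
      try
        // Fetch (or create on-the-fly) the connection pool for DBn.
        // Assumes Unit1 (the main form) is in the uses clause.
        sDBID:='DB'+slDatabase[i];
        cp:=Form1.GetConnectionPool(sDBID);
        kbmMWSQLiteQuery1.ConnectionPool:=cp;
        try
          kbmMWSQLiteQuery1.ParamByName['ConditionLow'].AsFloat:=conditionLow;
          kbmMWSQLiteQuery1.ParamByName['ConditionHigh'].AsFloat:=conditionHigh;
          kbmMWSQLiteQuery1.Open;
          if not kbmMWSQLiteQuery1.IsEmpty then
          begin
            // Push this database's matches to the client as one
            // incremental (partial) resultset.
            ds.SendPartialResultDataset(kbmMWSQLiteQuery1,bFirst);
            bFirst:=false;
          end;
        finally
          kbmMWSQLiteQuery1.Close;
          kbmMWSQLiteQuery1.ConnectionPool:=nil;
        end;
      except
        // Ignore databases that don't exist or fail to open.
      end;
    end;
  finally
    slDatabase.Free;
  end;
```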
The final part of the event closes the native query, detaches the connection pool from it, and generates a final (empty) dataset to return. Because at no point in the loop do we know whether we are in fact processing the last dataset (database), we send this empty dataset to indicate that it is the last one (we may encounter an empty or nonexistent dataset anywhere in the loop). We also tell the system that it is responsible for disposing of our temporary TkbmMemTable when it is done with it.

        bFirst:=false;
      end;
    finally
      kbmMWSQLiteQuery1.Close;
      kbmMWSQLiteQuery1.ConnectionPool:=nil;
    end;
  except
    // Do nothing.
  end;
end;
finally
  slDatabase.Free;
end;

// If no data collected, complain.
if bFirst then
  raise Exception.Create('No matching slides were found.');

// Now prepare a final (empty) dataset package.
mt:=TkbmMemTable.Create(nil);
mt.CreateTableAs(ds,[mtcpoStructure]);
mt.Open;
ADataset:=mt;
ACallerMustFree:=true;
end;

The client
Now all that's left is to build a client that can talk to the application server. We start out by creating a new VCL Forms application for the client. We add several components:
- TkbmMWTCPIPIndyClientMessagingTransport
- 2 x TkbmMWMemoryMessageQueue
- TkbmMWClientConnectionPool
- TkbmMWClientQuery
- TkbmMWBinaryStreamFormat
And a datasource, a data-aware grid and a couple of buttons.

Transport.InboundMessageQueue is set to point to qIn and Transport.OutboundMessageQueue is set to point to qOut. The TkbmMWClientConnectionPool.Transport property is set to point at Transport, and kbmMWClientQuery1.ConnectionPool is set to point to the kbmMWClientConnectionPool component. Further, kbmMWClientQuery1.TransportStreamFormat is set to point to kbmMWBinaryStreamFormat1 (matching a similar setting on the query service in the application server). The DB grid is hooked up to the datasource, which is hooked up to kbmMWClientQuery1, and the Transport.ClusterID property is set to 'Demo'. The QueryService property of the kbmMWClientQuery1 component is set to the name of the service on the server that we would like to provide data to our client query. That is 'DATAWAREHOUSE', hence we set the property to that string. As we didn't define a service version on the DATAWAREHOUSE service, we should clear out the property QueryServiceVersion.
Finally we need to tell the query service that clients are allowed to access its published query components. That is done via the property AllowClientNamedStatement on the query service data module.
We don't in fact connect immediately; instead we tell the connection pool, which maintains connections from the client to the application server, that it can connect when it needs to. Similarly, we tell the connection pool that it should kill its pool of connections when the user clicks the Disconnect button.
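In code, the two button handlers amount to toggling the pool; this sketch assumes the client connection pool exposes an Active property that enables on-demand connecting and, when cleared, drops the pooled connections:

```pascal
procedure TForm1.ConnectButtonClick(Sender: TObject);
begin
  // Allow the pool to establish connections to the server on demand.
  kbmMWClientConnectionPool1.Active:=true;
end;

procedure TForm1.DisconnectButtonClick(Sender: TObject);
begin
  // Drop all pooled connections to the application server.
  kbmMWClientConnectionPool1.Active:=false;
end;
```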
Figure 15:
Provided we have the correct SQLite databases available, and provided we have configured the kbmMWSQLiteConnectionPool correctly on the application server, it's now possible to make asynchronous queries with developer-controlled incremental on-the-fly transfer of partial resultsets to the client.
// Setup the parameter values for the query.
// Search all databases identified with the values 3, 4 and 5.
// Search for values in the range 10-20.
kbmMWClientQuery1.ParamByName['Database'].AsString:='3,4,5';
kbmMWClientQuery1.ParamByName['ConditionLow'].AsInteger:=10;
kbmMWClientQuery1.ParamByName['ConditionHigh'].AsInteger:=20;
kbmMWClientQuery1.AsyncOpen;

// Now the server will go through all the specified databases,
// search for the relevant criteria, and give us a message
// for each non-empty resultset, via the Transport's OnAsyncResponse
// event.
end;

And finally we need to write some code to handle the asynchronous responses, of which at least one, and potentially many, will arrive for each click on the Query button.

procedure TForm1.TransportAsyncResponse(Sender: TObject;
  const TransportStream: IkbmMWCustomResponseTransportStream;
  const RequestID: Integer; const Result: Variant;
  UserStream: TkbmMWMemoryStream);
var
  ps: TkbmMWDatasetPartialState;
begin
  // Is this a response to a request made via the query component?
  // Compare LastRequestID from the query component with the
  // RequestID of the incoming response message.
  if kbmMWClientQuery1.ActiveClient.LastRequestID=RequestID then
  begin
    ps:=kbmMWClientQuery1.SetQueryResult(Result,UserStream);
    case ps of
      mwdpsAll:     ; // We got everything in one response message.
                      // No more messages are coming for this request.
      mwdpsInitial: ; // We got the initial response message; the dataset
                      // has now been defined with structure (fields)
                      // and some data.
      mwdpsData:    ; // We got an intermediate response message. It was
                      // appended to the already existing data.
      mwdpsFinal:   ; // We got the final response message for this
                      // request. Now the dataset is complete.
    end;
  end;
end;

To summarise what we have demonstrated here:
* How to create a basic server and client using kbmMW Enterprise Edition
* How to work with virtual memory data sets
* How to work with SQLite databases
* How to work with the Wide Information Bus (WIB) messaging framework
* How to work with developer-controlled incremental resultsets
SPECIAL OFFER:
40% DISCOUNT when ordering a new license of kbmMW Professional or Enterprise, including competitive upgrades.
The coupon code to refer to is: KBMMWBLAISEPASCAL2010. This offer is only valid from October 10, 2010 until November 10, 2010.
The Lazarus USB stick is already prepared for use with this project. If you want to use your own Lazarus version you will have to prepare it and do the installation yourself. Using Lazarus you can build programs which can be compiled for different platforms. One of those platforms is Windows Mobile, also known as PocketPC or Windows CE. This article explains how you can write a simple Windows Mobile application which uses GPS and the embedded database SQLite. I also cover how you can debug the application and how you can test it with the Windows Mobile Emulator from Microsoft.

Multiplatform
Lazarus' main advantage is its support for multiple platforms. Free Pascal, the compiler used by Lazarus, can make executables for several platforms. Unlike .Net, Free Pascal does not make something like an intermediate executable which has to run on some framework. But what exactly are the differences between platforms? For Lazarus three aspects are important: the processor, the operating system and the widgetset used. How does this relate to mobile phones using Windows Mobile?

The processor: Mobile phones can use different types of processors. Most phones use ARM processors, but there are a lot of different types, and ARM processors can also be used with different settings, which can be important. Luckily, Windows Mobile phones always use the same kind of ARM processor, with the same (relevant) settings.

The operating system: Lazarus calls Windows Mobile 'WinCE', the name of an older version of this operating system. At least versions 5 and 6 are supported.

The widgetset: A 'widgetset' is a collection of graphic controls which can be used to build programs.
On Windows most programs use the default Windows widgetset, but it is also possible to build programs using the QT widgetset, for example. On Linux QT and GTK are the most popular widgetsets, and on OS X there are Carbon and Cocoa. In this case we want to use Windows Mobile and its default widgetset, called 'WinCE'. (Note that here the name of the widgetset is the same as the name of the operating system; we must be clear about the distinction between widgetset and operating system.) Since it is almost impossible to write the program on the mobile phone itself, we will use cross-compilation. That means that we use a Windows PC to create the binaries for WinCE. For that you need a cross-compiler and some extra utilities. On the Blaise Pascal Magazine Lazarus USB-stick these are preinstalled. If you are using your own installation of Lazarus, you have to download and install the ARM/WinCE cross-compiler add-in (Lazarus-0.9.28.2-fpc-2.2.4-cross-arm-wince-win32.exe).

Start Building
Now we can start building 'Hello World' for Windows Mobile. First we create a PC version, then a version for a telephone. Start Lazarus and start a new program. Place a button in the upper left corner and place the following code in the 'OnClick' event:
MessageDlg('Hello','Hello World',mtInformation,[mbok],0);
Save the program, then compile and run it to test it on the PC. If this all works, we can configure Lazarus to create a WinCE application. From the 'Project' menu choose 'Project Options...', then 'Compiler Options'. In this screen we select which widgetset to use, here called the 'LCL Widget Type'. (Note that versions of Lazarus later than the USB-stick version have a combobox dropdown where you can select the LCL Widget Type.) Select 'WinCE' and activate the 'Code' tree node (the 'Code generation' node in later Lazarus versions). Set the 'Target OS' to WinCE and the processor ('Target CPU family') to ARM. Click 'OK' to save the changes and then recompile the program. If everything goes successfully you now have a hello-world application for WinCE. You could try to start the application, but that will fail: a program compiled for an ARM processor will not work on an Intel Pentium processor. To test your program you'll have to copy it to a Windows Mobile phone and run it there.
Figure 1:
Figure 3: After the installation there is a new option in the start menu, 'Windows Mobile 6 SDK', with the sub-item 'Standalone Emulator Images', which contains a list of several images with different versions of Windows Mobile. Choose one to run. You will see a telephone on which Windows is starting up. Via File > Configure it is possible to set a 'shared folder'. Set it to the location where you stored the Lazarus project. This shared folder is now available on the simulated phone as an extra storage card. On the phone select 'Programs' in the start menu and run the File Explorer. Now select the 'Storage Card' in the upper left corner. You can see all the files from the Lazarus project, including the hello-world executable. Click on the program and your phone will tell you 'hello'. Figure 2:
Figure 4:
Tgps_fix_quality = (gps_fix_quality_unknown, gps_fix_quality_gps,
  gps_fix_quality_dgps);
Tgps_fix_selection = (gps_fix_selection_unknown, gps_fix_selection_auto,
  gps_fix_selection_manual);
Tgps_fix_type = (gps_fix_unknown, gps_fix_2D, gps_fix_3D);
TGPS_Position = record
  dwVersion: DWord;
  dwSize: DWord;
  dwValidFields: DWord;
  dwFlags: DWord;
  stUTCTime: Windows.SYSTEMTIME;
  dblLatitude: double;
  dblLongitude: double;
  flSpeed: cfloat;
  flHeading: cfloat;
  dblMagneticVariation: double;
  flAltitudeWRTSeaLevel: cfloat;
  flAltitudeWRTEllipsoid: cfloat;
  FixQuality: Tgps_fix_quality;
  FixType: Tgps_fix_type;
  SelectionType: Tgps_fix_selection;
  flPositionDilutionOfPrecision: cfloat;
  flHorizontalDilutionOfPrecision: cfloat;
  flVerticalDilutionOfPrecision: cfloat;
  dwSatelliteCount: DWORD;
  rgdwSatellitesUsedPRNs: array[0..gps_max_satellites - 1] of cdouble;
  dwSatellitesInView: DWORD;
  rgdwSatellitesInViewPRNs: array[0..gps_max_satellites - 1] of cdouble;
  rgdwSatellitesInViewElevation: array[0..gps_max_satellites - 1] of cdouble;
  rgdwSatellitesInViewAzimuth: array[0..gps_max_satellites - 1] of cdouble;
  rgdwSatellitesInViewSignalToNoiseRatio: array[0..gps_max_satellites - 1] of cdouble;
  Fillup: array[0..287] of byte;
end;
Figure 5: If there is a connection between ActiveSync/Device Center and the emulated mobile phone, we can start debugging. Return to Lazarus and place a breakpoint on the line on which the messagebox is opened. Start the program using the remote debugger (F9). Without the remote debugger selected, this would result in an error message that the application is not suitable for Windows; but now the application is started on the phone. You need some patience, though; it takes some time. When the program is running, click on the button and Lazarus will pause the program on the breakpoint. With F9 the program continues, behaving just as you expect while debugging applications. It's important, though, to know exactly what happens when you debug the application remotely. On the phone a folder called '\gdb' is created. The program which has to run on the phone is copied to this folder and started, and the debugger then connects to this running application. Note, however, that when a file with the same name already exists on the phone, the application is not transferred. This means that if you change the program, re-compile it and run it again, the 'old' version of the program is still used on the phone. So you have to remove the application from the folder '\gdb\' before you can debug the new version. One of the reasons that debugging is so slow is that copying the file to the phone takes so long. It's possible to speed up this process by excluding the debug information from the executable while linking the application, and placing this information in a separate file. This leads to a smaller executable, so it takes less time to copy it. You can find the option for putting debug information in a separate file in the compiler options, on the link tab ('Use external gdb debug symbols file (-Xg)').
DATABASE SPECIAL 2010 BLAISE PASCAL MAGAZINE
With GPSOpenDevice a connection is made with the GPS and, when necessary, the GPS is turned on. The fgpshandle has to be added to the form as a private variable of type PtrInt. This handle is used to read data from the GPS and to close it down. The error message is self-explanatory. Now we can replace the code which shows 'Hello World' with something more useful:
procedure TForm1.Button1Click(Sender: TObject);
var
  pGPSPosition: TGPS_Position;
  res: DWORD;
begin
  {$IFDEF WINCE}
  if (fgpshandle<>0) or ConnectGPS then
  begin
    FillByte(pGPSPosition, sizeof(pGPSPosition), 0);
    pGPSPosition.dwVersion := gps_version_1;
    pGPSPosition.dwSize := 376; // 344 for some WinCE versions
    res := GPSGetPosition(fgpshandle, pGPSPosition, 10000, 0);
    if res <> ERROR_SUCCESS then
    begin
      MessageDlg('Error',
        Format('Reading the GPS failed. Error code %d', [res]),
        mtError, [mbOK], 0);
      exit;
    end
    else
      MessageDlg('Success',
        Format('Longitude: %g, Latitude: %g. Satellites: %d:%d.',
          [pGPSPosition.dblLongitude, pGPSPosition.dblLatitude,
           pGPSPosition.dwSatelliteCount, pGPSPosition.dwSatellitesInView]),
        mtInformation, [mbOK], 0);
  end;
  {$ENDIF WINCE}
end;
The constants are for some regular values. TGPS_Position is a record in which a location is stored. TGPS_Device contains information about the GPS device. The Fillup array in TGPS_Position and TGPS_Device is there to work around a bug in several versions of Windows CE; this is explained later. As you can see, the FileTime and SystemTime types are used. These types are specific to Windows, so you have to add the Windows unit to the uses clause. This immediately results in an application that won't compile for non-Windows operating systems; not a problem for our Windows-specific application. Further, the ctypes unit has to be added. This unit adds several types which are used in the C programming language, in which the GPS API has been written. GPSAPI.DLL has four functions to communicate with the GPS. To be able to use these functions in a Pascal program, they have to be defined first:
{$IFDEF WINCE}
function GPSOpenDevice(hNewLocationData, hDeviceStateChange: PtrInt;
  szDeviceName: PWideChar; dwFlags: DWord): PtrInt; cdecl;
  external 'gpsapi.dll' name 'GPSOpenDevice';
function GPSCloseDevice(hGPSDevice: PtrInt): DWord; cdecl;
  external 'gpsapi.dll' name 'GPSCloseDevice';
function GPSGetPosition(hGPSDevice: PtrInt;
  var pGPSPosition: TGPS_Position;
  dwMaximumAge, dwFlags: DWord): DWord; cdecl;
  external 'gpsapi.dll' name 'GPSGetPosition';
function GPSGetDeviceState(var pGPSDevice: TGPS_Device): DWord; cdecl;
  external 'gpsapi.dll' name 'GPSGetDeviceState';
{$ENDIF WINCE}
The definitions are bracketed between $IFDEF statements to ensure that this code is only compiled when compiling for WinCE. Free Pascal uses these defines to write code which differs between operating systems; there are defines for all operating systems, processors and widgetsets. Here we use the define to make sure that the code still compiles on a system on which GPSAPI.DLL is not available. This way we can also use this program on a regular PC, although in that case the GPS won't work, obviously. Now we want to see the GPS position instead of 'Hello World'. Add the private function ConnectGPS to the form:
function TForm1.ConnectGPS: Boolean;
begin
  Result := false;
  {$IFDEF WINCE}
  fgpshandle := GPSOpenDevice(0, 0, nil, 0);
  if fgpshandle = 0 then
  begin
    MessageDlg('Error', 'Activating the GPS failed.', mtError, [mbOK], 0);
    exit;
  end;
  Result := true;
  {$ENDIF WINCE}
end;
The code above first checks if an fgpshandle is available, and if not, the program tries to create a connection with the GPS. If this is successful the pGPSPosition record is initialized: it is cleared completely and then the version number and size of the record are set. As you can see, the size of the record is given explicitly. In principle this is wrong; it should be 'sizeof(pGPSPosition)' instead of 376. The problem is that Windows CE 5 and 6 use a different size for pGPSPosition. The idea was that WinCE 6 would be backwards compatible with the size of WinCE 5, but there's a bug in some versions of WinCE that breaks this compatibility. That's why a constant size is used here. If the given size is incorrect, the program will raise an error with error code 87. Should this happen, replace '376' by '344' and try again. Later on a better solution will be discussed. After the pGPSPosition record is initialized, GPSGetPosition is called with four parameters: first the fgpshandle, second the pGPSPosition record, and third the maximum age, in milliseconds, that the answer may have. That's because the GPS sends information to the phone continuously: it could be that a location was sent to the phone a second ago. If this age parameter is larger than 1000 (1 second), the call will immediately return the value sent one second ago. The fourth parameter is always zero. After a successful call to GPSGetPosition a messagebox is shown with the GPS coordinates of the current location. It also shows how many satellites were used to obtain the current location (more satellites means a more accurate result) and the total number of satellites the GPS 'sees'.
This reads and displays the GPS coordinates on the screen. If there are no coordinates available yet, a message is shown alerting you to the absence of a signal. What is new is that the program tries to set up a connection with dwSize:=376 and, if this doesn't succeed, with dwSize:=344. The size which works is then stored so it can be used later. Note that for this to work it is important that the size of the actual record is at least as large as the value given here. The TGPS_Position we defined earlier is for Windows Mobile 5 and normally has a size which is too small for Windows Mobile 6. But because version 6 can't handle the format of version 5 in all cases, the size of the record is stretched by adding an unused array of bytes (TGPS_Position.Fillup). This way it will work on all versions. Now we have to hook up the events for the buttons and timer:
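The full listing of this retry logic is only shown in the accompanying figure. It can be sketched roughly as follows, assuming a private helper called GetGPSPosition and a private ffixedsize: DWORD field on the form that caches whichever size worked (both names are assumptions):

```pascal
function TForm1.GetGPSPosition(var pGPSPosition: TGPS_Position): Boolean;
var
  res: DWORD;
begin
  FillByte(pGPSPosition, sizeof(pGPSPosition), 0);
  pGPSPosition.dwVersion := gps_version_1;
  // Try the cached size first; default to 376.
  if ffixedsize = 0 then
    ffixedsize := 376;
  pGPSPosition.dwSize := ffixedsize;
  res := GPSGetPosition(fgpshandle, pGPSPosition, 10000, 0);
  if res = 87 then // wrong size for this WinCE version, try the other one
  begin
    ffixedsize := 344;
    pGPSPosition.dwSize := ffixedsize;
    res := GPSGetPosition(fgpshandle, pGPSPosition, 10000, 0);
  end;
  Result := res = ERROR_SUCCESS;
end;
```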
Figure 6: Now we add a new private function to obtain the GPS-position. The code is as follows:
Figure 7:
Figure 8:
procedure TForm1.InitialiseDB;
var
  sl: TStringList;
  i: integer;
begin
  if SQLQuery1.Active then
    Exit;
  SQLite3Connection1.Open;
  sl := TStringList.Create;
  try
    SQLite3Connection1.GetTableNames(sl);
    sl.Sorted := true; // Find requires a sorted list
    if not sl.Find('coordinates', i) then
    begin
      SQLite3Connection1.ExecuteDirect(
        'create table coordinates(LocalTime datetime, GPSTime datetime, '+
        'longitude real, latitude real);');
      SQLTransaction1.CommitRetaining;
    end;
  finally
    sl.Free;
  end;
  SQLQuery1.Open;
end;
Now on pressing the Start button, the program tries to connect to the GPS and, if it succeeds, the timer is activated. The timer makes sure that after each interval (Timer1.Interval) the GPS is checked to see if there is any new information. This information is shown on the screen. Using the Stop button, the timer is deactivated and the connection with the GPS is closed.

Storage of a walk in the park
Suppose we want to walk in the park and save our position every 5 seconds. We want to use a database for that, but not one for which we have to install a complete database server. And it should work on Windows CE. That's a perfect match for SQLite (www.sqlite.org). If you want to work with SQLite, the only thing you have to do is place sqlite3.dll in the same folder as your executable or, if you want to make it system-wide, in c:\windows\system32. Let's try to get this working on the PC first; this way the {$IFDEF WINCE} defines are also used for something useful. First download sqlite3.dll and place it in the system32 folder. We also want to access the database from within Lazarus, so placing it in the project folder is not enough. (It is possible, of course, but in that case the dll should also be copied to the location of the Lazarus executable.) Now place a TSQLite3Connection, a TSQLTransaction and a TSQLQuery from the SQLdb tab on the form. Connect TSQLTransaction.Connection and TSQLQuery.Connection to the TSQLite3Connection. Set TSQLite3Connection.Transaction to the TSQLTransaction. From the 'Data Access' tab add a TDataSource and from the 'Data Controls' tab a TDBGrid. Connect the TDataSource.Dataset property to the TSQLQuery and set the DataSource property of the grid to the TDataSource. Finally give the TSQLQuery the following query (SQL): 'select * from coordinates;'. Now choose a DatabaseName for the TSQLite3Connection. The database name is nothing more than a filename in which the data is stored; in my case that's 'h:\src\pgg-wince\gpsdata.sdb'.
You can check whether SQLite is installed and configured correctly by setting the Connected property to True. Because there is no separate tool available to create the table we need, we create it in code. Add the private InitialiseDB method shown above to your application. This code first checks whether the query is already active. If it is not, a connection to the database is made. Then it checks whether a table with the name 'coordinates' already exists. If this table does not exist, it is created with the fields 'LocalTime', 'GPSTime', 'Longitude' and 'Latitude'.
The transaction is then committed to save it all, after which the query can be opened. Add a call to this method in the OnCreate event of the form, to ensure the table is always opened at the start of the program.

To test and create this simple database and table, we have to compile the application for a PC. Go to the compiler options and set the widgettype to win32/win64. The code to be generated should be for an i386 processor, and you have to choose win32 for the operating system. Start running the program in the debugger and you'll see that this fails. This is because the debugger is configured to debug applications for ARM processors, remotely on a mobile phone. Go to 'Environment' -> 'IDE Options' -> 'Debugger' and set the debugger path to the 'normal' gdb executable (lazarus\mingw\bin\gdb.exe). Start the application again and then close it. Now check whether a database file with a size greater than zero bytes has been created. If that's the case, you can set the Active property of the TSQLQuery on the form to True. The fields added to the table will then become visible in the grid.

To scale the grid with the form size on the phone, set BorderSpacing.Around to 5 and BorderSpacing.Bottom to 40. Set ReadOnly to True and switch off dgEditing and dgIndicator in the Options property. Set AutoFillColumns to True. This is not all, though. To show the columns properly on a small screen, the properties of each column have to be set manually. Double-click on the grid and click three times on the 'Add' button so that three columns are added. Select the first column, set its FieldName to the 'LocalTime' field and its DisplayFormat to 'hh:mm:ss', so that only the time is displayed, not the date. You can also give the column a suitable title. Configure the other two columns so that the 'latitude' and 'longitude' are shown, with '#0.#######' as DisplayFormat. With that, the controls are configured.
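The column setup just described could equally be done in code. A minimal sketch, assuming the same grid and field names as in the text:

```pascal
// Sketch: configuring the three grid columns in code instead of via the
// column editor (names assumed from the text above).
procedure TForm1.SetupGridColumns;
begin
  with DBGrid1.Columns.Add do
  begin
    FieldName := 'LocalTime';
    DisplayFormat := 'hh:mm:ss'; // show only the time, not the date
    Title.Caption := 'Time';
  end;
  with DBGrid1.Columns.Add do
  begin
    FieldName := 'latitude';
    DisplayFormat := '#0.#######';
  end;
  with DBGrid1.Columns.Add do
  begin
    FieldName := 'longitude';
    DisplayFormat := '#0.#######';
  end;
end;
```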
Now we have to take care of saving the positions into the database.
Page 78
COMPONENTS
DEVELOPERS
Figure 9:
function TForm1.GetGPSPosition(var GPSPosition: TGPS_Position;
  StoreDB: boolean): dword;
var
  res: DWORD;
  dtime: TDateTime;
  ttime: Windows.systemtime;
  dist: double;
begin
{$IFDEF WINCE}
  ...
    lMessage.Caption := Format('Last position: (%d)',
      [GPSPosition.dwSatelliteCount]);
    lMessage1.Caption := Format('%g %g',
      [GPSPosition.dblLatitude, GPSPosition.dblLongitude]);
    if StoreDB then
    begin
      SQLQuery1.First;
      SQLQuery1.Insert;
      SQLQuery1.FieldByName('localtime').AsDateTime := Now;
      if GPSPosition.stUTCTime.Year <> 0 then
      begin
        ttime := GPSPosition.stUTCTime;
        dtime := ComposeDateTime(
          EncodeDate(ttime.Year, ttime.Month, ttime.Day),
          EncodeTime(ttime.Hour, ttime.Minute, ttime.Second,
            ttime.Millisecond));
        SQLQuery1.FieldByName('gpstime').AsDateTime := dtime;
      end;
      SQLQuery1.FieldByName('latitude').AsFloat := GPSPosition.dblLatitude;
      SQLQuery1.FieldByName('longitude').AsFloat := GPSPosition.dblLongitude;
      SQLQuery1.Post;
    end;
  end
  else
    lMessage.Caption := Format('Waiting for connection. %d satellites.',
      [GPSPosition.dwSatellitesInView]);
  ...
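The listing does not show how GetGPSPosition is driven by the timer. A minimal handler might look like this; the exact timer event name and the flush calls are assumptions based on the components used in this article:

```pascal
// Sketch: the timer polls the GPS, the new row is buffered by the
// dataset, and the buffered insert is then written to the SQLite file.
procedure TForm1.Timer1Timer(Sender: TObject);
var
  pos: TGPS_Position;
begin
  GetGPSPosition(pos, True);       // read the GPS and insert a row
  SQLQuery1.ApplyUpdates;          // write the buffered changes to the table
  SQLTransaction1.CommitRetaining; // commit, keeping the transaction open
end;
```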
ApplyUpdates takes all the changes in the update buffer of the dataset and writes them to the table; CommitRetaining then commits the transaction. So now the application is finished, at least for this article. There's a lot that could be added: for example a system that measures the distance you have travelled, or a service that records how often and where you go by car. But all the basic components are here and the program can be extended in any way you like.

Conclusion
It turns out that developing for Windows Mobile is not that difficult. Configuring all the necessary tools is a big part of the work. You also need to pay attention to details such as using the right version of sqlite3.dll on your phone and using the right debugger version. Once you have all that set up, it's child's play.
Professional help and support
The new www.Lazarussupport.com website has been set up to promote the use of Lazarus and Free Pascal and to make these tools accessible to a larger audience. The goal is to provide all the information that new Lazarus users need. It also offers commercial support for those who want to use Lazarus commercially. Our employees have highly developed professional skills. http://www.lazarussupport.com/ Contact us whenever you need to.
Five Considerations for Choosing an Effective SQL Development Tool
By Scott Walz

The challenges of cross-platform database development and management can lead to significantly increased costs, as IT departments rush to hire new developers to support unfamiliar platforms. What DBAs need is enhanced manageability, change management and automation, and the right tools to simplify, streamline and reduce the complexity of day-to-day tasks. Without such tools, it will be impossible to handle the increasing pressure of new business requirements and limited resources, and, ultimately, performance and quality will suffer. Fortunately, comprehensive toolsets are now available to alleviate some of these pain points. SQL development tools can be particularly helpful in cutting costs and saving time in cross-platform environments. The following five considerations can help you choose the right tool for the job.

1. Does the tool provide a rich user interface?
Database development and management tools must fit how you work. Tools that offer a single user interface, no matter what the target platform, won't require you to familiarize yourself with a new UI, and you can start being productive immediately. This is particularly important when bringing on new talent to take over or assist with existing projects. Additionally, a single UI can cut down on ramp-up times as well as training costs.

2. Does the tool have a comprehensive tools menu?
During SQL development, you may need to use numerous tools and data repositories. A tool that can link you directly to other tools will help streamline development. Look for solutions that offer comprehensive menus and let you access other important resources with a mouse click. It's also helpful if the tool lets you interface through the menu with database file searches and scripting features, or view visual differences between files and objects. Quick links to other software, such as editing tools, are also desirable.

3. How deep is the tool's knowledge of other platforms?
DBAs and developers often work with outdated, inflexible tools and have to spend a lot of time sifting through online documentation for bits of information that will help them use the tools. This is a waste of time. New tools are out there that offer efficient wizards which automatically convert queries into the appropriate format for the target platform. For example, you can quickly create tables even if you don't know the right data fields and types. This simplicity can be enhanced by a single user interface that is familiar and intuitive.

4. Does the tool enable code organization and the application of standards for DB development?
To work quickly, organization is extremely important. Look for a tool that enables you to organize and categorize data sources by platform for easy retrieval. Some tools have customizable Bookmark features that link you automatically to the resources you need or use most often. Others offer filters that help to remove the noise and enable you to focus on only relevant information. Some tools also enable fast project creation on a new platform through reverse engineering; they can extract procedures from one platform and apply them to the new project, and even check in changes automatically to your chosen version control solution. Such automation makes adapting to new environments much easier and eliminates human error.

5. Can the tool streamline and improve coding?
As a DBA or developer working in cross-platform environments, you spend a lot of time learning the constructs of a new procedural language and understanding the relationships between data structures and how to form queries. Tools that help you understand relationships between foreign and primary keys are tremendous time-savers. Some tools even enable you to drag and drop relationships into SQL windows by defining the meaning of the relationships in the metadata. By bringing this information into the query builder automatically, you can quickly construct effective queries, even in unfamiliar environments. Such automation lets you worry less about query construction so you can focus on writing good code. Look for tools that also provide validation and error checking. Debugging features can save you considerable time and ensure that bad code never makes it past the tuning stage.

When choosing the right tool to assist with cross-platform database development, it's essential to know your options and choose a tool that addresses key considerations. Being familiar with what features are available to streamline and automate the more difficult and tedious tasks involved in working with different platforms will make life a lot easier, and help you be more productive, more efficient and more valuable to your organization.
owning the performance of their applications from the application layer through the database layer. If performance issues work their way past unit testing, they will typically be found again during load testing in QA and slow down the QA process. If these performance issues manage to reach production, they can wreak havoc on the very service levels the DBA is trying to meet, causing system slowdowns or even outages, which are considered the most dangerous type of performance concern for any business. And if it is determined through a process of elimination that the SQL code is responsible for the performance bottleneck(s), the developer is often left searching for a needle in a haystack, tuning SQL randomly by shoveling it into a tuning engine that must sift through all of the SQL to suggest ways to speed up any and all code, no matter its duration or frequency. This is a common symptom of CTD (compulsive tuning disorder), and it results in building a haystack around the issue and wasting inordinate amounts of time and effort.
Five Tips to CTD Relief

Profile the Database
The symptom of chasing phantom issues can easily be remedied by profiling the database first. This practice can be applied during unit testing in development, during load testing in QA, and in production. It captures snapshots of meaningful statistics and data, represented graphically, so that performance issues can be pinpointed easily and immediately. It also allows a collaborative workflow between production DBAs, QA engineers and developers, who can share profile snapshots and conduct a very focused and effective troubleshooting process.

Discover the Bottleneck
Once a profile has been captured, it is easy to pinpoint performance issues via a graphical representation of data broken down into three dimensions: SQL statements, events, and sessions (programs/users). A red line represents the number of available CPUs on the target database, and any spikes that break above that line grab your attention. You can crop out and drill down into the dimensions and statistics for each spike to determine the root cause and take action. This is a very useful productivity feature that helps avoid troubleshooting irrelevant issues.

Find the Worst-Performing SQL
After you have identified the cause of the bottleneck, the worst-performing SQL statements, events, and sessions contributing to the performance issues are automatically sorted to the top. You can click on a SQL statement to see a graphical explain plan that presents the execution path the database-specific optimizer takes to run the code. You can also see session details about the specific statement, including execution statistics. All of this information is recorded during the profiling session, and by profiling and discovering the bottleneck first, it becomes easier and faster to find the worst-performing SQL statements contributing to the bottleneck.

Tune the SQL
Best practices in SQL tuning indicate that the best way to speed up application and database performance is to tune the worst-performing SQL first. Those who suffer from CTD and do not follow the preceding three steps are unable to identify the worst SQL statements. Instead they find themselves shoveling any and all SQL into a tuner, chasing phantom issues and causing massive inefficiencies and slow turnarounds on performance problems. Once you are able to pinpoint the worst-performing SQL, you can select that SQL to be run through the tuner. The tuner will verify that the database-specific optimizer is taking the fastest execution path, that the SQL is written effectively, that existing indexes are leveraged and missing indexes are created, and that the underlying schema is defined effectively for maximum performance.

Stress Test to Validate Performance Gains
Once the SQL is tuned, it is important to measure and validate the performance gains by stress testing the original, un-tuned SQL and the newly tuned SQL side by side, while running a profiling session and capturing the resulting snapshot. By simulating a number of parallel sessions (users) and a number of executions over some duration, you can ensure that the SQL you have tuned will in fact stand up to QA load testing and production stress levels.

In Summary
It sounds simple enough, but many developers still suffer from CTD. That's a shame, because help is available. By following a painless five-step regime and using the right tools, any developer can learn to quickly profile and pinpoint the worst-performing SQL statements, streamline their SQL tuning process and validate their work before passing the code back to QA or the DBA for final testing. Everyone will think it was tuned by an expert.
About the author:
Scott Walz
has more than 15 years of experience in database development and serves as senior director of product management for Embarcadero Technologies. In this position, Scott oversees the direction of the company's database product family, while focusing on database development and administration products. Prior to joining Embarcadero four years ago, Scott served as development lead for Louisville Gas & Electric. He holds a bachelor's degree in computer information systems from Western Kentucky University.
The Lazarus Complete Guide will be available in mid-December 2010. You can order it directly at our web shop. If you pre-order the LAZARUS COMPLETE GUIDE you will get a Lazarus USB stick for only 15.00
Graphical User Interface programs for Windows 32 and 64, Windows CE, Mac OS, Unix and Linux
Barnsten is Embarcadero's Technology Centre in the Benelux. Barnsten became CodeGear's representative in 2007 when CodeGear closed down its offices in the Benelux. The main products to represent were Delphi, Delphi for PHP, RAD Studio, C++Builder, JBuilder and InterBase. The CodeGear division was taken over by Embarcadero in 2008. Embarcadero was at that time a publisher of powerful database development tools such as ER/Studio, DBArtisan, DB Optimizer, DB Change Manager, Rapid SQL and more. Last year Barnsten merged with Embarcadero's former database tool partner and now supports all the Embarcadero tools. As most applications support databases, this is a great fit! You can now find all the tools you need for multi-platform development at one company. Barnsten employees have over 16 years of experience with the Embarcadero tools. Visit the Barnsten website to learn more about us, the tools, special offers and local events! Contact us for a special offer on the database development tools. This offer is for Blaise Pascal readers only! Benelux customers can call us now at +31 23 542 22 27 or send a mail to info@barnsten.com