This release contains many improvements and will probably be the last one.
Druid is now position independent. Simply issue:

  java -jar <path>/druid.jar

Note, however, that you need Java 1.4
------------------------------------------------------------------
MAIN FEATURES
- One config file per user
- Added the jEdit syntax highlighting package (for both code and SQL)
- Added modules for various file formats (HTML, XML)
- Improved Java module (added JDO support)
- Added the Velocity template engine (with docs) (thanks to Misko Hevery)
- Added triggers to tables
- Added comments to tables and fields
- Added a Castor module (with docs) (thanks to Misko Hevery)
- Added docs about issues with PostgreSQL (thanks to Nuno Rodrigues)
- Improved SQL generation module
- Many small improvements
With the Velocity engine (included) you can write your own templates
and generate whatever you want. Read the docs to learn how.
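As an illustration, a Velocity template walks over context objects with #foreach and references their properties with $ variables. The $tables, $table.name and $field names below are hypothetical; the actual objects druid exposes to templates are described in the included docs.

```velocity
## hypothetical sketch: list every table and its fields
#foreach( $table in $tables )
Table: $table.name
#foreach( $field in $table.fields )
  - $field.name ($field.type)
#end
#end
```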
------------------------------------------------------------------
JDBC
The JDBC part has been completely rewritten. The main changes are:
- Designed for huge databases
- Improved DDF format (now supports binary, long varchar, blob and clob)
- Improved data editing and visualization
Druid can now edit longvarchars and clobs with the record editor
and can show all kinds of binary data (binary, varbinary, long
varbinary, blobs). The druid data format (DDF) has been improved
to support these types for import / export.
KNOWN ISSUES
*** ORACLE ***
Blobs and clobs cannot be imported because Oracle tries to cast
the blob to its own class, raising an exception.
Longvarchars are NOT edited using updatable result sets because
setCharacterStream doesn't work. This implies that when you edit
a longvarchar with the record editor, you must make sure that
the visible fields form a key for that record.
*** HSQLDB ***
You cannot edit longvarchars because the method setCharacterStream
is not implemented.
There is also a problem when editing real values in the data grid. If
you create a field with type (for example) decimal(10,4), the driver
doesn't return the correct information about the type. Druid
finds an integer and doesn't allow you to enter real values.
This is due to the ResultSetMetaData.getScale method, which
always returns 0.
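For context, the scale of a numeric column is the number of digits to the right of the decimal point, which JDBC drivers report through ResultSetMetaData.getScale(). A minimal sketch of what a correct scale of 4 means for a decimal(10,4) value, using plain java.math.BigDecimal (not druid code, and no database connection):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class ScaleDemo {
    public static void main(String[] args) {
        // A value stored in a decimal(10,4) column carries scale 4:
        BigDecimal v = new BigDecimal("12.3456");
        System.out.println(v.scale());   // prints 4

        // A driver that wrongly reports scale 0 makes the column look
        // integral, so the fractional part would be rejected or dropped:
        System.out.println(v.setScale(0, RoundingMode.DOWN));   // prints 12
    }
}
```

A correct driver would return 4 from getScale() for such a column; HSQLDB's 0 is what makes druid treat the field as an integer.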
------------------------------------------------------------------
CHANGES
The file format has changed. Files saved in the new format can be
loaded by any druid release from this one onward. However,
druid is still able to load files in the old format.
For a complete list of changes see the usual docs/versions.txt file.
Cheers
Andrea Carboni