Several sections of code have been reworked to allow for the latest CFLib changes. All of the code that has been migrated to date compiles cleanly.
The ICFLibAnyObj interface now returns std::u16string instead of a pointer for the toString() method, as every object is required to respond to toString() with some sort of value. Returning NULLs is not acceptable, and so has been prevented entirely.
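The revised contract can be sketched as follows. This is a minimal illustration, not the actual CFLib header: the `toString()` signature matches the description above, but the `Example` class is invented for the demonstration.

```cpp
#include <string>

// Minimal sketch of the revised contract: toString() returns a value,
// never a pointer, so callers can never receive a NULL.
struct ICFLibAnyObj {
    virtual ~ICFLibAnyObj() {}
    virtual std::u16string toString() const = 0;
};

// Hypothetical implementing class, for illustration only.
struct Example : public ICFLibAnyObj {
    std::u16string toString() const override {
        return u"Example";
    }
};
```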
CFSecurityHPKey.cpp has been coded and clean-compiles.
The code depends on libuuid1, not uuid.
The CFSecurityCluster*.cpp implementations have been migrated.
At this point in time, I'm focused on CPP Proto, which is prototyping a migration of the Java foundation layer code to C++14 for Debian-based amd64 Linux, with just enough of the obj layer interfaces to be able to build the foundation layer.
Once that is complete, I'll take a step back and enhance MSS Code Factory 2.9 to support the additional tags required for the Cpp and Hpp equivalents to the Java attributes of the SchemaObj and Table definitions in a Business Application Model.
I don't look forward to enhancing the code for the editor, though. That is going to take a while, even with substitutions by Eclipse editors. Maybe I'll just use vi to do the editing instead; it is a pretty simple copy-paste-substitute pattern to be followed, just in a number of segments of code. I'm going to presume a 1:1 match between Java and Hpp/Cpp tags for now.
Then it is time to work on the rules for producing the foundation layer code, including makefile support if I can.
All of the interfaces defined for cfsecurityobj have had their headers migrated. The implementations (.cpp files) haven't been migrated yet.
The Obj.hpp and EditObj.hpp interface headers in cfsecurityobj have been migrated, and are now properly specified as pure virtual interfaces.
The TableObj.hpp and SchemaObj.hpp interfaces have not been migrated yet.
The headers now use int16_t, int32_t, and int64_t for attributes of the objects, but not for more generic interfaces like compareTo(). Generic types are only used for system and infrastructure APIs now, not for application data.
CFSecurityAuthorization has been migrated, which required an initial migration of the Obj and EditObj headers from the second layer of code (cfsecurityobj.)
The interface headers for the foundation layer compile cleanly now.
Slowly but surely I will work my way through this pile of code. Man, this sure has been a reminder of how much grunt work is involved in C++ coding...
The header files for the interfaces of the lowest level code, cfsecurity, have been migrated. At least I think they have.
Massive changes, but I'm still not even done migrating the headers from the Java source, never mind tackling the .cpp files.
Use the dh_make "-p" option instead of moving directories around. That also makes it easier to establish directory references from other projects.
The prototype code has been switched over to using the new CFLib code. It is nowhere near building, of course, but I've started on the migration to using the new code.
All of the functionality required for CFLib has now been coded, though none of it has been tested. At this point, the APIs and interfaces have been stabilized, and only additional functionality should be added rather than subtracting from the current code base or modifying it substantially.
The CFLibXmlUtil::formatTimestamp() method is now used in the messaging for the Overflow, Underflow, and Range exceptions instead of printing the raw seconds since epoch.
Next I need to deal with my timezone information. I've got the settings retrieved, but I need to figure out how to translate that string into a timezone offset value using standard Linux facilities. I don't care about Windows. This is NOT intended to be cross-platform code. I write for Linux. I don't care about any other platforms.
I noticed that many of the other packages I've worked with return instances instead of pointers when allocating new objects. In retrospect this semantic seems clear and obvious, as Java has a one-syntax-fits-all approach to references, pointers, and instances.
My presumption is that instances returned or allocated within a method are automatically destructed when that method goes out of scope. You have to "catch" the new instance if you want to keep it beyond the scope of the method.
The semantics of the XmlUtil parsers are NOT changing, however -- they continue to return pointers to objects so that they can indicate an empty parse by returning a NULL. The semantics of the formatters have been changed to always return a temporary string instance instead of an instance you have to manage, though.
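The split in semantics can be sketched as below. The names `parseInt32` and `formatInt32` are hypothetical stand-ins, not the actual CFLibXmlUtil signatures; the point is only the pointer-versus-value contract.

```cpp
#include <string>

// Hypothetical parser: returns a pointer so that NULL can signal
// "nothing parsed". The caller owns any non-null result.
static int* parseInt32(const std::string& text) {
    if (text.empty()) {
        return nullptr;              // empty parse -> NULL
    }
    return new int(std::stoi(text)); // caller must delete
}

// Hypothetical formatter: returns a temporary string by value,
// so there is nothing for the caller to manage.
static std::string formatInt32(int value) {
    return std::to_string(value);
}
```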
Getters also return constant pointers so that the caller does not own the reference, and can not alter or destroy it due to the use of the const semantic. I thought about changing those to const references, but I realized that doesn't fit with the way the manufactured code for MSS Code Factory works -- it is heavily reliant on pointers that can be late-resolved, which means there is (at least temporarily) no instance in existence to reference.
I went through all the headers and their corresponding implementations, looking for "orphaned" methods that didn't have matching declarations, or which were declared but not implemented. In some cases there was just a mismatch of getters and setters that needed correcting, and a lot of missing destructors were added as well.
I am now comfortable that I have no missing implementations for any of the members or methods of my declared C++14 interface.
There are also no unresolved symbols reported any more.
The instance getter methods are now tagged with "const" so that you can get the attributes of a const reference, but not set the attributes. As Java has no concept of a const object reference, there was no support for this syntax in the original Java code.
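A small sketch of what the const tagging buys, using an invented `Widget` class rather than any actual CFLib type: the const-qualified getter is callable through a const reference, while the setter is not.

```cpp
#include <string>

// Illustrative class; not from the actual code base.
class Widget {
    std::u16string name;
public:
    explicit Widget(const std::u16string& n) : name(n) {}
    // const getter: callable on a const reference; the caller
    // does not own the returned pointer.
    const std::u16string* getName() const { return &name; }
    void setName(const std::u16string& n) { name = n; }
};

static std::u16string readName(const Widget& w) {
    // w.setName(u"x");  // would not compile: w is const
    return *w.getName();
}
```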
Next I want to go through the interfaces and implementations and look for any methods which have been defined but not implemented.
The namespace has been changed to cflib for the sake of consistency with the directory name the include files are found in, the name of the installed libraries, etc. Everything shall be just so, shall it not?
There is a source archive created by the build process that is not a complete set of everything included in the full project source archive. It really should only be used for the sake of providing debug symbols/source code to display when tracing code.
Just a couple of things that were bugging me.
Simply running a "./PackageCFLib210 nnnnn" will clean the builds, move the directories, edit the version tagged files, and run a clean build with production of the .deb package as its ultimate goal.
I love Linux. You can automate just about anything if you're good with command line tools.
The proper version numbers for the shared libraries are now used, although the build system produces shared libraries named "*.so.0.0.0" as the primary copy of the library, which all of the other versioned names reference by symlink.
Similarly, I've overridden the rule from the default Makefile in my Makefile.am to copy the libraries using version numbers instead, and to use spawned shells to establish the symlinks.
The header files have all been switched over to using a "#include <cflib/CFLibBlah.hpp>" syntax.
The main .deb file for libcflib-dev looks like it is a reasonable size for the compiled debug code with symbols included. A stripped file would be roughly 1/10th the size.
The .dsc seems rather small, though. I don't think it is including all the source code, only the headers.
This is the work in progress on getting proper integration with the debian build process. I'm still having trouble getting the shlibs packaged for distribution, but other than that, I seem to have resolved all of my problems.
I am so close I can taste it. But it is late and I'm going to bed.
A .deb file gets produced, but it is too small. It should be 4MB or larger, the same as the installer was. I still have more work to do. In the meantime, you can build the source yourself.
The deb helper build prepares the code and everything, but it fails when trying to package the results for distribution, and I haven't figured out why yet.
The Deb helper skeleton has been filled out, and you can now follow the procedures up to and including a debuild. However, I'm not sure where it ends up putting the built files or the packaged .deb. Maybe I'm still missing steps. More reading will happen shortly.
The dh_make produced skeleton of debian configuration files has been added to the project.
I'll need to modify them accordingly, and continue to work my way through the Debian packaging tool chain. Eventually the "PackageCFLib210" script will take the source tree from source to built and packaged, ready for distribution.
Then I'll finally have a template project for "How To Build A Debian Library" that I can use for my various component libraries of a manufactured project. I want complete integration of the C++, GNU, and Debian tool chains for the manufactured code.
The debug build has been packaged for distribution.
I want full debug information for the GNU tool chain.
The GNU auto tools have successfully produced a shared library in the .libs subdirectory of the build.
The Makefile syntax errors have been resolved, though I'm not entirely sure how/why, but I'm not arguing with progress.
You can now do the usual steps to produce a Makefile from the repository source files, but the Makefile produced has some lines around 480 or so that have a prefix of a space instead of a tab. Manually edit to correct that, and the build still fails.
But it is attempting to build code. I'm that much farther along than I was this morning.
The version numbers used by the various scripts and utilities have been synchronized to 2.10.13863.
The initial GNU auto tool scripts have been added to the project. You can do an "autoreconf -fmi", an "autoconf", an "aclocal", an "automake --add-missing", and then an "automake" that does nothing further.
I need to work my way through the auto tool documentation to figure out how to get the tools to package a shared debug library for distribution. I'm hoping the tools like my 2.10.release version number for the library.
The CFLib 2.10 source tree for C++14 has been split off from the C++ prototype project as the standalone project net-sourceforge-MSSCodeFactory-CFLib-2-10.
CFLib 2.10 is the C++14 migration of CFLib 2.9 for Java 8. This is a pre-production release for internal use only; use at your own risk.
All source is built using the GCC 6 tool chain on Ubuntu Linux 17.04 amd64 with near-default compiler options. As far as I can recall, only one special flag was added to the Eclipse build rules to support initialization of constant statics for classes and to enable virtual inheritance throughout the code base.
This is the alpha release of the CFLib 2.10 code migrated to C++14 from Java 8.
The migration of CFLibXmlUtil is complete. I need to look into the Java code and see if there were any other utility or support classes I missed that I need to migrate sooner rather than later. But this may well be the alpha release of CFLib 2.10 for C++14.
The const void* Blob methods have been implemented, leaving the XML string parsers. You don't normally need the XML string parsers because the SAX parser processing does the XML string translation for you.
The uuid_t methods have been implemented, along with a wealth of additional type-specific parsers. That leaves the XML string parsers, and the Blob methods. I need to find a base64 encoding/decoding solution...
Only the Blob and Uuid parsers and formatters remain to be coded for CFLibXmlUtil. Then I will be done with this big class migration.
The majority of the CFLibXmlUtil methods have been migrated to C++14.
There are still the parser and formatter methods for uuid_t, float, double, and mpfr::mpreal types to be implemented. Once those remaining methods are implemented, I believe CFLibXmlUtil will be complete.
I'll have to give the code a good run-through to make sure I haven't omitted the implementation of any methods.
Two more CFLibXmlUtil methods have been migrated to C++14.
Over half of the methods for CFLibXmlUtil have been migrated to C++14.
This coordinated release is a clean build of the CFLibXmlUtil functionality so far. You can not access most of the methods of CFLibXmlUtil.hpp yet because the implementations are commented out until I get around to migrating them.
I was able to get the system environment into UTC time, save the local timezone, initialize the globals for CFLib related to time, and implement parseDate(), parseTime(), and parseTimestamp() for CFLibXmlUtil without resorting to the Boost libraries. As dealing with Gregorian dates had been my only concern, I no longer need Boost and will be removing it from my system.
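The Boost-free parsing described above can be sketched with standard Linux facilities. This is an illustration, not the actual CFLibXmlUtil code: the function name and format string are assumed, and it relies on strptime() and the glibc/BSD timegm() extension, which is fine since this code targets Linux only.

```cpp
#include <ctime>    // strptime, timegm (visible under glibc with g++ defaults)
#include <cstring>
#include <string>

// Hypothetical sketch: parse an ISO-style timestamp string as UTC
// without Boost, using only standard Linux facilities.
static time_t parseTimestampUTC(const std::string& text) {
    struct tm parts;
    std::memset(&parts, 0, sizeof(parts));
    if (strptime(text.c_str(), "%Y-%m-%dT%H:%M:%S", &parts) == nullptr) {
        return (time_t)-1;   // signal a failed parse
    }
    return timegm(&parts);   // interpret the broken-down time as UTC
}
```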
The build is only clean because I've decided to comment out a bunch of stuff with "#if 0" blocks in CFLibXmlUtil.cpp.
CFLibXmlUtil has been sketch-migrated from Java to C++14. I believe the header is ok, but there is still a lot of work even to get the method arguments straight for the implementations.
There is now a debug library existing on my system for CFLib 2.10 as compiled and packaged by Eclipse CDT.
I believe the successful production of a shared library also means all of the symbols in the library have been resolved, i.e. there are no missing method implementations or constant initializations.
Several more exceptions have been migrated, and the destructors for all the exceptions are at least stubbed as empty methods.
The source for the migration of the Java CFLib 2.9 source to the C++14 CFLib 2.10 library has been appropriately renamed.
There is no way I can see to migrate these classes, as they rely on Java's ability to query the class that defines an object. C++14 uses a different technique, where you attempt a dynamic cast and a NULL result means the cast couldn't be completed.
There is no way to dynamically specify the target class of a dynamic cast in C++14, so you just can't get there from here.
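The C++14 technique referred to above can be shown in a few lines: you attempt a dynamic_cast to a statically named target class, and a null result means the object is not an instance of that class. The class names here are invented for illustration.

```cpp
// Sketch of the dynamic_cast idiom: the target class must be named
// statically at the cast site; a nullptr result means "not an instance".
struct Base { virtual ~Base() {} };
struct DerivedA : Base {};
struct DerivedB : Base {};

static bool isDerivedA(Base* obj) {
    return dynamic_cast<DerivedA*>(obj) != nullptr;
}
```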
Fortunately there were no uses of either class directly in CFBam 2.10, so I was able to get rid of the classes. If they turn out to be used by the internals of some part of CFLib or CFCore, I'll deal with it in some other fashion.
The CollisionDetected exception has been tweaked, and the EmptyArgument and InvalidArgument exceptions have been coded.
CFLibCollisionDetectedException, CFLibDbException, and CFLibDbUtil have been migrated. That leaves 18 more objects to deal with on the initial migration of the CFLib exception framework and essential utilities.
The initial migration of the Overflow, Range, and Underflow exceptions has been completed. In the future, I'll add support for construction with int8_t, uint8_t, uint16_t, uint32_t, and uint64_t arguments to flesh out the type support for the "normal" set of C++14 integer types.
I just continue to make progress with whipping the code for CFLib into shape.
There is still a long way to go.
The CFLib class headers have been reworked to allow for memory management by switching most instances of const std::u16string* to const std::u16string&.
The CFLib interfaces now specify virtual destructors, and the string members are now owned instances of the objects, as are any ICFLibAnyObj instances that were passed as arguments to them.
In cases where const ICFLibAnyObj* had been passed to the exception constructors, the const has been removed. The exception implementation takes ownership of the object and will destroy it accordingly; once the constructor returns, the caller should not destroy the instance that was passed as a construction argument.
In the case of strings, a copy of the argument string is taken, so the arguments to the exception constructors are still designated as const, indicating that the caller is still responsible for destroying that instance.
I've also removed the CFLibBigDecimalUtil.hpp file. I'm not ready to start migrating utilities unless they're used by the implementation of the exceptions.
Next I need to refresh the .cpp files to add implementations of the destructors, take copies of argument strings, and destroy the string copies and ICFLibAnyObj* members in the destructor implementations.
Java also assumes all objects are virtual, so I start right from the core of ICFLibAnyObj. That is where I gather the "common functionality" I expect all my object implementations to provide, which is based in the most part on the Java Object contract.
I coded a couple more CFSecurityCluster objects and interfaces.
All objects produced by MSS Code Factory will respond to the ICFLibAnyObj interface, which is extended relative to the original Java interface.
Java already provides the additional methods as part of the generic Object contract, but C++ has no such niceties.
Well, at least one object clean compiles now. It isn't a small one, either.
I'm thinking of making the buffers derive from ICFLibAnyObj, and having them always return false when asked if they are an instance of a class, and to return nulls for the navigation methods because buffers don't know anything about objects.
The same would apply to the keys.
If HBuff and HPKey don't already descend from ICFLibAnyObj, they need to do so as well. The exception is that HBuff returns the historical resolutions of the containership hierarchy, which may result in null resolutions of current data if an instance has been deleted.
That way the compareTo() and equals() methods would be valid for those data types.
Then I can add the compareTo(), equals(), hashCode(), and toString() templates to the ICFLibAnyObj interface.
The source all clean compiles, but Eclipse CDT won't turn it into a shared library for some reason. I may have to just write my own makefile. No big deal. It isn't like it would be the first time.
Of the pieces of CFLib that I've ported to C++14 so far, the CFLibFiringList.cpp file is the only one with an error. All of the headers are clean. But I've thought about it, and I can't think of a way around this problem.
The exception hierarchy and the factory constructor for exception instances has been ported to C++14.
Just a checkpoint of the work in progress for CFLib 2.10.
I've started working on migrating CFLib in the cfsecurity210 project for now; later I'll move the code to the separate library I have in Eclipse and figure out how to reference a library from another project with CDT.
The .cpp skeletons have been copied from the originating Java 8 source, and decorated with the includes and namespace references. A lot of work has been done, though much remains before I'll have a clean build.
The headers now compile. Remember I am not doing a full migration of CFSecurity by hand. I've migrated a set of core objects, and will now proceed with the C++ implementation of those core objects.
The Java String references have been replaced by the C++14 equivalent, std::basic_string<char16_t>.
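For reference, std::u16string is just an alias for that template instantiation, so the two spellings are interchangeable:

```cpp
#include <string>
#include <type_traits>

// std::u16string is defined as std::basic_string<char16_t>.
static_assert(
    std::is_same<std::u16string, std::basic_string<char16_t>>::value,
    "u16string is basic_string<char16_t>");
```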
UUID has been replaced by uuid_t.
The CPPProto project will be a hand written partial migration of the CFSecurity Java implementation to C++14 for AMD64 Linux systems with the GCC 6 tool chain.
Once I am satisfied with the prototype, I'll start working on rules for MSS Code Factory 2.9 to produce the core object libraries and a thread-safe Ram implementation.
The extensions .hpp/.cpp are the best option for file names for C++14 nowadays. There was a time when I used .hxx/.cxx, but I think it was the Microsoft tool chain that liked that. I'm not targeting Windows at all. Either implement standards or bugger off.
The hashmap of primary-keyed objects would be of shared_ptr references, owning the objects. The index maps would use weak_ptrs, as would the collections contained by the objects themselves. The objects would have a shared_ptr reference to the collection members themselves. I haven't decided who owns the PKey instances yet. I may need to implement a detach() method if I haven't already done so in Java. I forget whether I did or not.
forget() is different -- it deallocates the member that is in the hashmap of primary-keyed objects, so that none of the weak_ptr references can be resolved. When a weak_ptr fails to resolve, the read() for the non-null instance is performed automatically to create a cached copy of the referenced instance.
I'd need to seriously restructure the code for the core cache, to deal with allocation and deallocation properly. I think I can isolate the deallocation to forget() with some careful code structuring.
For the TableObj implementations, the primary-keyed collection would look something like HashMap<PKey, std::shared_ptr<Object>>, and the indexes would be maps of std::weak_ptr references.
The TableObj is responsible for allocating and deallocating copies of the Object's PKey member for its collection member keys.
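The ownership scheme described above can be sketched as follows. The types and member names are invented for the illustration; the point is that the primary-key map owns the objects via shared_ptr, the index holds weak_ptrs, and forget() drops the owning reference so the index entries can no longer be resolved.

```cpp
#include <map>
#include <memory>
#include <string>

// Illustrative object; int stands in for the PKey type.
struct Object {
    int pkey;
    std::string name;
};

// Hypothetical sketch of the TableObj ownership scheme.
struct TableObj {
    std::map<int, std::shared_ptr<Object>> byPKey;        // owning map
    std::map<std::string, std::weak_ptr<Object>> byName;  // non-owning index

    void insert(int pkey, const std::string& name) {
        auto obj = std::make_shared<Object>(Object{pkey, name});
        byPKey[pkey] = obj;
        byName[name] = obj;
    }

    // forget() erases the owning shared_ptr; the weak_ptrs in the
    // index expire and can no longer be locked.
    void forget(int pkey) { byPKey.erase(pkey); }
};
```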
The Ram database will be made thread-safe as well, by grabbing mutexes over the members of the RamXxxTableObj.cpp implementations via the RamXxxTableObj.hpp interfaces.
I do not want to thread lock over any finer granularity than a table, as Ram databases are only intended for very simple applications, like editors.
There will be no ACID considerations for the Ram implementation in C++ any more than there are for the Java version. I just want to be able to make the C++ code thread-safe.
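Table-granularity locking as described above can be sketched like this. The class and method names are assumed, not taken from the actual RamXxxTableObj code; each table carries one mutex, and every member operation grabs it for its duration.

```cpp
#include <map>
#include <mutex>
#include <string>

// Illustrative Ram table with one mutex guarding the whole table,
// matching the "no finer granularity than a table" policy.
class RamTable {
    std::mutex tableMutex;
    std::map<int, std::string> rows;
public:
    void insertRow(int pkey, const std::string& value) {
        std::lock_guard<std::mutex> guard(tableMutex);
        rows[pkey] = value;
    }
    bool readRow(int pkey, std::string& out) {
        std::lock_guard<std::mutex> guard(tableMutex);
        auto it = rows.find(pkey);
        if (it == rows.end()) return false;
        out = it->second;
        return true;
    }
};
```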
C++ on a 64-bit platform will allow me to escape the bounds of the 32-bit JVM, which is a limitation right now.
I can't implement a Java version of CFUniverse, which is to be the conglomerate of all MSS Code Factory 2.10 projects except for CFDbTest, including CFAccounting and Mark II. I want a *global* server image that you run specialized client interfaces against.
There is no need to rework the databases, only the database interfaces. I need the Ram implementation first.
In the long run, I want to fully automate the manufacturing and build processes under a C++ implementation. I'm even considering creating an MSS Code Factory 2.10 as a C++14 implementation. We shall see.
I know I'll need to modify MSS Code Factory itself to define the C++ implementation and interface customizations in a similar fashion to the JavaXxx attributes of the Table and SchemaDef specifications. They will be named CppXxx, though there may be fewer or more customizations available.
I'm hoping to emulate the Java code structure as much as possible, with one C++14 library per package, though of course the directory structure for the cplus rule base and implementations will be different from the Java versions, as you don't specify full ownerships with C++ code. C++ just has the .cpp files in the package directory, and an include subdirectory with the .hpp files.
This is going to be a fun year, I think. This is going to take a long time to learn what I need to know about modern C++ and implement the prototype, never mind coding the rule base.
There was still a lowercase table name being used by the delete scripts in the ClearDep implementation. This has been corrected, and verified through painful testing with the Mark II Chatbot.
There was a problem with the MySQL rule set for MSS Code Factory, which has been corrected (join tables were being specified with lowercase names instead of mixed case.)
The source code for the MSS Code Factory 2.10 projects has been successfully extracted from the GitHub repositories into a clean directory tree, remanufactured by MSS Code Factory 2.9.13823, clean built, and repackaged as a refresh deployment.
It turns out that commons logging doesn't seem to like Log4J 2.x.
All of the databases should now have code that supports the self-referencing DelDeps.
You should also be able to specify self-referencing ClearDeps, with the same restriction that only the terminal node of the dependency chain can be self-referencing.
As far as I know, only the Mark II project actually makes use of this new feature at this time.
The MySQL connector has been upgraded, and all of the relevant scripts in the 2.10 bin path have been updated to reference the new jars, as have the build.xml scripts.
MySQL 5.7 syntax is now supported and has been verified under Ubuntu 16.10.
The DelDep relationship chain can now terminate with a self-referencing relation, so that you can build trees of nodes. I don't think it will work with subclass hierarchies, though, only flat tables. This feature is only used by the Mark II project at this time, not the 2.10 projects.
The SAX RAM loaders will now execute properly because they initialize the security information for the connection to the RAM instance. The requirement has changed -- you must always use connect( "system", "system", "system", "system" ) for RAM databases.
The copyright headers have been updated for 2017.
The build and packaging scripts have been corrected to include the XML package and the SAX RAM Loader package.
All of the projects now support RAM database imports of the standard loader files of the manufactured SAX XML parser. This code was manufactured by MSS Code Factory 2.9.13795.
The adjusted RAM implementation has the same login/logout and transaction semantics as a persistent store now, though it cannot actually roll back failed transactions.
You can now let the security forms reference a RAM schema instead of an XMsgClient, and after validating your access to the application's security keys, you are automatically logged in as system/system against the RAM database without providing a user name and password.
All of the 2.10 projects have been remanufactured, rebuilt, and repackaged using the code produced by MSS Code Factory 2.9.13789. Although the bugs corrected by 2.9.13789 should not affect the 2.10 code base, I like to remanufacture the entire set of code produced by the tool when a major defect is corrected, just in case.
CFLib and CFCore are now released under an Apache V2 license, so the 2.10 projects have been rebuilt and repackaged to leverage the relicensed code.
All of the 2.10 projects are now under the Apache V2 license.
I realized that while I am a GPL enthusiast, most of the third party code I rely on is under the Apache license. I felt the 2.10 projects would find wider acceptance with the new license.
The Apache jars have been updated to the latest versions, and CFLib 2.9.13778 and CFCore 2.9.13779 were used to build and package this latest release.
The packaging scripts have been corrected, and the builds repackaged. The naming of the HTTP jars had changed.
MSS Code Factory CFAsterisk, CFBam, CFDbTest, CFEng, and CFFreeSwitch are now distributed under a GPLv2.0 license instead of the former LGPLv3 license. As the sole contributor to the maintenance of the code base, I can make this license change without having to get agreement from a developer community.
After some consideration, I realized that the LGPLv3 attempts to restrict the use of code in cloud environments. The cloud is a fact of life. Any attempt to restrict the use of code in the cloud is asinine.
All of the projects are now built and shipped with the LGPLv2 versions of CFLib and CFCore as well.
I intend to explore the information available through the WordNet 3.0 database with the 3.1 files overlaid on top of the 3.0 core. You will find the extracted files in the bin/data/WordNet-3.0 directory of the installer and source tree.
The initial skeletal model of the English parse objects has been completed and is ready for review and experimental use.
This is a clean compiling version of the core objects for CFEng. There never will be a persistence layer for the parse objects, so only the core jar and the Ram storage jar are produced.
This is the first cut of the English syntax model, CFEng. All I can really say for it is that it manufactures code. It is not a valid model for doing development with yet.
I'll use the in-memory storage and services to build the parser. This is the "knowledge structure" that the parser component will be trying to build in-memory as its "understanding" of a piece of text.
All of the third party libraries have been updated to the latest versions as delivered/supported by Ubuntu 16.04 LTS.
I need to create a fresh work environment from the git repositories. I've extracted the git repositories and updated their configurations, including wiring a new SSH key from this new development installation. Keys should be refreshed instead of brought forwards from time to time.
The initial goal of this project is a command line utility which lets you specify the old and new models to be compared for creating diff scripts. I can load entirely separate copies of the old and new versions of a database model (Business Application Model.) In fact, it will be quite trivial to do, though I'll obviously have to make some assumptions about prev-next/old-new argument ordering for the command line API.
That will give me two in-memory images of entirely separate models. So now I need to compare and diff the two models, tying them together with optional attributes referencing the "DefiningXxx" of the current object. There would be a GetDefiningXxx()/SetDefiningXxx(Obj value) accessor pair for the defining object, which is a reference from the new model to the old model that it is defined by in the comparison. There are "GetPrevXxxAaaa" accessors used to expose the previous Xxx attributes specified. You aren't expected to use Setters through these APIs, so only Getters are wanted -- there are no SetPrevXxxAaaa( Obj value ) accessors. There should be un-decorated GetXxxAaaa() getters on the object that retrieve the new/current values of the attributes in question.
Once all of the old objects and attributes have been mapped by setting the DefiningXxx of all the objects between the compared models, you are left with a list of the objects that have been removed from the old system in the new model, and a list of new objects that have been added to the new model that weren't in the old system.
Creating the drop scripts and create scripts for the database changes alone becomes a relatively trivial exercise.
Another change is I need to add a database version string singleton to be stored indicating the current release level of a given database instance.
I'll also need to add a command line argument for specifying the output directory where the new database scripts are to be written. I don't want to make assumptions about the naming, as I'm providing flexibility with the inputs so I don't want to get stubborn about outputs now.
There are no changes to the code when remanufacturing with the Oracle 8 JDK. This contrasts with earlier results from a few years back, when Oracle's JDK was producing invalid results for manufacturing runs. The bug has been fixed during the intervening years; Oracle's JDK now runs for all known test cases.
Rebuilt with Oracle Java 8 JDK for Linux 64-bit instead of OpenJDK 8.
Remanufactured by MSS Code Factory 2.9.13739 and rebuilt by OpenJDK 1.8.0_91 using CFLib 2.9.13737 and CFCore 2.9.13738.
I decided I was abusing PageData for CFCrm 2.10 just because it was easy to do, so I reverted back to non-paged data for those tables. Paging is a clumsy user interface that should only be used when absolutely necessary.
Contact lists and their components do not grow without end like accounting data does, so it makes more sense for people to split up their contacts into multiple lists if they are having performance problems than it does to force them to click a "more data" button over and over during normal use of the application(s).
The latest build as manufactured by MSS Code Factory 2.9.13731-SP2 provides full support for page data by the JavaFX ListPanes, PickerPanes, PickerForms, and FinderForms.
Development will now focus exclusively on Code Factory Accounting 1.2, with any bug fixes to MSS Code Factory itself rolled out as they are encountered. However, all of the new code has been tested, so I am not expecting any bug fixes in the near future. But then again, what programmer expects bugs in their code?
The PickerPage and PickerForm now support page data for JavaFX.
The common lookups defined by CFSecurity and CFInternet no longer enable page data, so all of the databases for the 2.10 projects need to be recreated. This version of the 2.10 applications is not compatible with earlier database models.
The ListPane objects for JavaFX now automatically implement page data support.
There was an occurrence of a reference to the 2.7 CFBam schema in the CFBam Editor's save/export code. This has been corrected to reference the 2.10 version of the schema.
There were focus propagation errors in the JavaFX ViewEditPane code, which have been corrected. They were preventing the display from being properly refreshed after certain data changes were applied.
The latest code as manufactured by MSS Code Factory 2.9.13721 Service Pack 1 adds support for the PageData="true" attribute of a model's tables. See the notes for the 2.9 releases for details.
With the completion of the XMsg enhancements for the page data methods, the new code is now ready to be tested.
All of the databases now have the code in place to invoke the new stored procedures for paging data.
I still have to do the Oracle JDBC enhancements for the new stored procedures, but the code has been written and clean-compiles for DB/2 LUW, MySql, PostgreSQL, and SAP/Sybase ASE.
All support for Microsoft SQL Server has been removed from all of the projects because I have no way to test that database at this time, so I cannot have it on the official list of supported vendors any more.
All of the 2.10 projects now clean build despite the changes I'd made to the Java implementation at the foundation of the system, and despite the significant number of CFSecurity, CFInternet, and CFCrm table objects that had PageData="true" added to their specifications. I wanted all of those big fat lookup and security tables to be pageable. It is unlikely anyone really wants to look at that raw data, but I make it possible to do so without hitting the network for a full set of umpteen records every time they open a record browser over a global lookup.
From here on in, all 2.10 projects are affected by changes to the page data implementation, because even CFSecurity has tables with page data enabled.
The PostgreSQL JDBC implementation now defines the new prepared statement variables for the new page data queries, and cleans them up as part of the releasePreparedStatements() implementation for the table.
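A minimal sketch of that per-table cleanup pattern is shown below. The field names are illustrative, not the actual generated identifiers, and the real table classes would hold java.sql.PreparedStatement references; AutoCloseable stands in here so the sketch can run without a database connection.

```java
// Sketch of the prepared-statement cleanup described above.
// Field names are hypothetical, not the generated identifiers.
public class PgTablePageStatements {
    // In the generated code these would be java.sql.PreparedStatement;
    // AutoCloseable is used so the sketch is runnable without a database.
    AutoCloseable stmtPageAllBuff;
    AutoCloseable stmtPageBuffByLookupIdx;

    public void releasePreparedStatements() {
        if (stmtPageAllBuff != null) {
            try {
                stmtPageAllBuff.close();
            } catch (Exception e) {
                // a real implementation would log and continue
            }
            stmtPageAllBuff = null;
        }
        if (stmtPageBuffByLookupIdx != null) {
            try {
                stmtPageBuffByLookupIdx.close();
            } catch (Exception e) {
                // a real implementation would log and continue
            }
            stmtPageBuffByLookupIdx = null;
        }
    }
}
```

The point of nulling the fields after closing is that a subsequent connection can lazily re-prepare the statements without tripping over stale handles.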
The Obj layer enhancements have been completed and now support the new data paging interfaces. The back end implementations are still not-implemented-yet stubs, but the supporting logic has been coded.
Each of the persistence layers implements a set of not-implemented-yet stubs for the new paging interface methods. The interface specification itself should remain stable while I flesh out the implementations.
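The stub pattern can be sketched as follows. The method name, signature, and exception type are placeholders of my own choosing (the real generated code presumably throws CFLib's own not-implemented exception rather than a bare UnsupportedOperationException):

```java
import java.util.List;

// Sketch of a not-implemented-yet stub for a new paging interface method.
// Method name and signature are illustrative, not the generated API.
public class CFSomeTableStub {
    public List<Object> pageAllBuff(Object authorization, Long priorKey) {
        // Stable interface, stub body: callers get a clear signal that the
        // back end for this persistence layer has not been written yet.
        throw new UnsupportedOperationException(
            "pageAllBuff() is not implemented yet for this persistence layer");
    }
}
```

Keeping the interface fixed while the bodies are stubs means the Obj layer and the persistence layers can be compiled and iterated on independently.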
The Oracle version of the page data read procedures install without errors, as verified through CFAccounting 1.2.
The SAP/Sybase ASE page data queries now restrict their result sets to 25 rows.
The SAP/Sybase ASE version of the page data read procedures install without errors, as verified through CFAccounting 1.2.
The MySql version of the page data read procedures install without errors, as verified through CFAccounting 1.2.
The DB/2 LUW version of the page data read procedures install without errors.
The new PostgreSQL data paging stored procedures install successfully now.
The new PostgreSQL stored procedures are ready for a test install. They have been wired to the crprocs_schema.[bat|bash] scripts.
The new sp_page_dbtablename[_all|_by_suffix] stored procedures have been coded for PostgreSQL, but not installed to the database nor executed yet.
The PostgreSQL rules for the crsp_page_dbtablename_all.pgsql file make use of the new GEL spread statement, and a manual code review has been done on that stored procedure. Note that the new procedures are not wired in to the bash scripts yet, so they don't actually get created.
All of the database stored procedures now sort the duplicate key result sets by the descending primary key for the table being queried.
This is the latest code as manufactured by MSS Code Factory 2.9.13673.
Only the primary key index has to be in descending order when PageData="true" is specified. The other indexes over the table can be ascending or descending according to the whims and wishes of the designer.
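As a rough illustration of why the descending primary key matters, here is the page-selection idea in plain Java, written keyset-style (start after the last key the client has seen, take the next 25 rows). This is my reading of the scheme, with assumed names; it is not the generated code.

```java
import java.util.List;
import java.util.stream.Collectors;

// Sketch of page selection over a primary key sorted in descending order.
// PAGE_ROWS matches the 25-row limit mentioned for the ASE queries; the
// method name and the keyset-style "prior key" parameter are assumptions.
public class PageSelectSketch {
    static final int PAGE_ROWS = 25;

    static List<Long> nextPage(List<Long> keysDescending, long priorKey) {
        // With a descending index, "the next page" is simply the first
        // PAGE_ROWS keys strictly below the last key already delivered.
        return keysDescending.stream()
                .filter(k -> k < priorKey)
                .limit(PAGE_ROWS)
                .collect(Collectors.toList());
    }
}
```

Because the index already delivers the keys in descending order, the database can satisfy such a query with an index range scan plus a row limit, with no sort step per page.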
Clean build of all 2.9 projects as manufactured by MSS Code Factory 2.9.13666-PROD.
CFFreeSwitch 2.10 builds clean.
CFAsterisk 2.10 builds clean.
CFDbTest 2.10 builds clean.
CFCrm 2.10 builds clean.
CFInternet 2.10 builds clean.
CFSecurity 2.10 builds clean.
Initial code from manufacturing run for 2.10 projects by MSS Code Factory 2.9.13658.