A good way to speed up your PHP application is to use a persistent connection to the IBM i DB2 database. This does, however, carry one implication worth considering when modernizing your legacy applications: persistent connections create “shared read” (*SHRRD) locks on the objects you are accessing. Courtesy of Alan Seiden (who devised the two methods below), there are two ways to release these locks: one quick and dirty, the other a bit more elegant.
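As a minimal sketch, a persistent connection with the ibm_db2 extension looks like this (the connection values are placeholders, and this requires the ibm_db2 extension on your server):

```php
<?php
// Minimal sketch of a persistent DB2 connection using the ibm_db2 extension.
// '*LOCAL', 'MYUSER', and 'MYPASS' are placeholder connection details.
$conn = db2_pconnect('*LOCAL', 'MYUSER', 'MYPASS');

if ($conn === false) {
    // Connection failed; report the reason and stop.
    die(db2_conn_errormsg());
}

// Statements run on $conn reuse the same QSQSRVR job across requests,
// which is faster but leaves *SHRRD locks on the objects you touch.
```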
1. Quick and dirty:
If you just restart the Apache instance, all corresponding QSQSRVR jobs will be reset, clearing the locks.
2. A bit more elegant:
You can add the command below to the evening job stream that later allocates or clears the file. The command asks any jobs holding locks to release them. In my experience, this has worked perfectly with QSQSRVR jobs that hold *SHRRD pseudo-locks on the object.
For file MYFILE:
ALCOBJ OBJ((MYLIB/MYFILE *FILE *EXCL *N)) CONFLICT(*RQSRLS)
If allocating a specific member:
ALCOBJ OBJ((MYLIB/MYFILE *FILE *EXCL MYMEMBER)) CONFLICT(*RQSRLS)
Make sure you add a MONMSG immediately after the ALCOBJ command to handle messages such as “Cannot allocate object,” which can arise in normal operation.
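Putting the two together, a minimal CL sketch might look like this (CPF1002 is the usual “Cannot allocate object” message; adjust the handling to suit your job stream):

```
/* In the evening CL job stream, before the step that clears MYFILE */
ALCOBJ     OBJ((MYLIB/MYFILE *FILE *EXCL *N)) CONFLICT(*RQSRLS)
MONMSG     MSGID(CPF1002) /* "Cannot allocate object" -- handle as needed */
```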
And there we have it!
Based on this video from Gary Hockin (http://blog.hock.in/2014/05/14/maximising-performance-in-zf2-phpuk/) and tips from Alan Seiden, I have written this guide to help anyone speed up their ZF2 app. The biggest improvement I have seen so far came from switching from the standard autoloader to the ClassMapAutoloader: my load time went from 700-900ms (as measured by ZendDeveloperTools) to 100-300ms. As a guideline, aim for pages that load in 200ms or less on average.
1. Cache! You can cache your module.config.php to improve response time. There is a great article from Rob Allen on how to not only cache your config, but also enable the caching only in production. Caching your config while you are still making changes to it can cause quite a headache!
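As a sketch of what this looks like in a ZF2 skeleton app, config caching is controlled by the module listener options in config/application.config.php (the paths shown are the skeleton defaults):

```php
<?php
// config/application.config.php (fragment)
return array(
    'modules' => array(
        // ... your modules ...
    ),
    'module_listener_options' => array(
        // Merge and cache all module configs into one file so they are
        // not re-merged on every request. Enable this only in production.
        'config_cache_enabled' => true,
        'config_cache_key'     => 'app_config',
        'cache_dir'            => 'data/cache/',
    ),
);
```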
2. Do not use the standard autoloader. Generate a classmap for each module for the BIGGEST performance increase. It can be a bit manual, because you will need to run the generator in each module's root directory, but the end result is worth it. There is supposedly a way to do this with Composer, but I have not tried it. I develop on my Mac, so I did this piece on my personal computer. In each module directory I ran:
php classmap_generator.php --overwrite
The --overwrite flag replaces a classmap if one already exists. Full instructions here:
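The generator writes an autoload_classmap.php into the module root. Wiring it up in your Module class might look like this sketch of the standard ZF2 pattern (module name is illustrative):

```php
<?php
// module/Application/Module.php (fragment)
namespace Application;

class Module
{
    public function getAutoloaderConfig()
    {
        return array(
            // The classmap resolves classes with a single array lookup,
            // avoiding filesystem stat calls on every autoload.
            'Zend\Loader\ClassMapAutoloader' => array(
                __DIR__ . '/autoload_classmap.php',
            ),
            // Fall back to the standard autoloader for unmapped classes.
            'Zend\Loader\StandardAutoloader' => array(
                'namespaces' => array(
                    __NAMESPACE__ => __DIR__ . '/src/' . __NAMESPACE__,
                ),
            ),
        );
    }
}
```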
3. Always make sure you have a favicon.ico (as well as the other files browsers request automatically). When these files are missing, every request for them still has to be handled by the full framework stack, slowing down your website. The skeleton app has one built in to help with this.
Building web APIs is becoming an increasingly common method of reusing business logic across client platforms. The Zend Framework team has created an API builder called Apigility to help us construct our APIs. Version 1.0 has been a wonderful starting point for those of us trying to automate the tasks of creating controllers, routing configuration, filtering and validating input, etc.
Yesterday (4/16/15) the Apigility team released version 1.1, which includes a few very useful new features and enhancements: performance improvements for the administration UI (via a complete rewrite), the ability to create deployment packages, and per-API authentication. You can see the full change log here.
For CRUD-based systems there is now a Database Autodiscovery feature that allows you to specify which tables in your database you’d like to expose via web services and automatically adds basic validation based on the table’s column types. I’ll be interested to see how this works with IBM DB2.
If you’d like help getting started using Apigility to build web APIs, check out the Apigility resources in The Learning Hall.
My company is creating an API using ZF2 on Zend Server via the PHP Toolkit. We are keeping our backend logic in RPG, and the API allows us to call it via our new interface. One of the biggest questions when you start separating out your pieces is where and how you should validate your data, and which piece carries which responsibility.
As a general guideline, do your parameter validation in your view or PHP and your logic validation in RPG. Let me explain:
Parameter Validation: This is validating the type of data you will use to call the API. For example: is the field alphabetic or numeric? If numeric, does it allow decimals or negative numbers?
Logic Validation: Once the data has been validated as the right format and type, your API validates whether the call itself is correct. For example, if you want to delete a product, parameter validation makes sure the submitted product ID is in the correct format; the API then determines whether the product actually exists, or is even allowed to be deleted (the logic).
With this plan in place, your developers know who carries which responsibility as you develop or expand your project.
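To make the split concrete, here is a minimal plain-PHP sketch of the parameter-validation side (the function name and rules are illustrative, not our actual API):

```php
<?php
// Illustrative parameter validation: check format/type only.
// Whether the product exists or may be deleted is logic validation,
// and belongs in the RPG layer.
function isValidProductId($id)
{
    // Must be a non-empty string of digits only: no sign, no decimals.
    return is_string($id) && $id !== '' && ctype_digit($id);
}

var_dump(isValidProductId('12345')); // bool(true)
var_dump(isValidProductId('-12'));   // bool(false)
var_dump(isValidProductId('1.5'));   // bool(false)
```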
Zend recently released Zend Framework 2.4, and with it come fixes, enhancements, and long-term support! Zend Server makes it very easy to update the framework installed on your system and used by default (and to revert to earlier framework versions if needed), so I am always one to try the newest release.
Yes… I am one of those early adopters you always hear about. With Zend Server and the way it manages the libraries for ZF2, it really is a ‘low risk’ situation, and I want my company’s software product to be used and effective on the latest version when possible. So I took the plunge and updated, as I had for the past six ZF2 releases, all of which had gone flawlessly. This time… I had a bug.
Whether from ignorance or newb-ness, I attributed this to some kind of backward compatibility break or change of functionality. Also, considering that IBM i and DB2 are a little different (a good different, mind you) on the back end, I wondered if some of the support for our beloved machine had been broken. I contacted a guru in the ZF2 world named Samsonasik. He is a fantastic coder and has helped me MANY times. He looked at my code and what I was doing, and saw that it worked fine in ZF 2.3.7 but broke in ZF 2.4. At the end of our session he suggested I post an issue to the GitHub repository for ZF2, and he would comment on it.
In this moment, dear reader, understand I felt like I was really helping out our IBM i community! I had ventured forth, tried something new, and found an error that couldn’t be explained by my coding! I was making sure the IBM i community had a working Zend Framework with which they could develop their applications. I dare say I was a little proud of myself.
I posted my issue to GitHub and the feedback from the developers and community was pretty quick. Lots of people asked about my code or server caching, and every answer I gave pointed to a legitimate bug. Then the head of the ZF2 project weighed in and suggested I had caching enabled after all. There, sitting in public/index.php, was a line from EDPSuperLiminal that kept caching turned on: the culprit. I commented out this line and my code worked fine. Issue closed.
I write all this to tell you, dear reader, that you need to find a way to turn off and clear your cache when you are upgrading your framework. Save yourself some heartache: set up a plan for this so you have an easy upgrade path on your server. Hopefully my story saved you some stress and embarrassment!
One of the most important decisions when modernizing your application is the overall blueprint for your stack: which languages you should use, how they should communicate, which parts of your development team are responsible for what, and where your Business Intelligence (BI) should live.
For example, our current application is completely green-screen and RPG based, with code that was started 25 years ago and has been updated and improved annually at every customer site since. It consists of a giant processing job run every night (we call it our Night Job) and an interface based on DDS. That age of code could seem like a liability to some, but in our world (Inventory Management and the Buying Process) certain algorithms and routines take years to prove they produce accurate results.
When we started deciding how to modernize, we knew we didn’t want to abandon our BI, which exists in RPG and has done a remarkable job. Rewriting those routines in another language would cost us our advantages over the competition. But the front end had to evolve and modernize to meet our current and future customers’ expectations. The night job may one day be rewritten in Node.js, once the technology matures and its availability on IBM i becomes more widespread.
We decided our users will have a web-based front end built on Zend Server for IBM i. If you picture a traditional ‘CRUD’ interface, this front end is responsible for just the ‘R’ (read). It uses PHP, ZF2 (migrating as we go), and Twitter Bootstrap. ZF2 has built-in DB2 support that abstracts the database work behind our multiple screens. When it comes to displaying data and changing workflow, this stack lets the front-end team move quickly.
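For reference, ZF2’s built-in DB2 support is configured through Zend\Db. A sketch of the adapter config (connection values are placeholders, not our actual setup) might be:

```php
<?php
// config/autoload/global.php (fragment) -- placeholder connection values
return array(
    'db' => array(
        'driver'   => 'IbmDb2',
        'database' => '*LOCAL',
        'username' => 'MYUSER',
        'password' => 'MYPASS',
    ),
    'service_manager' => array(
        'factories' => array(
            // Standard ZF2 factory that builds the adapter from the
            // 'db' key above.
            'Zend\Db\Adapter\Adapter'
                => 'Zend\Db\Adapter\AdapterServiceFactory',
        ),
    ),
);
```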
Where this idea really shines, though, is that our front-end developers are not allowed to C, U, or D against the database. While updating a field in our database is simple enough with ZF2, what would be lost is all the considerations our current setup takes into account. If a vendor belongs to a certain buyer group and the user changes that buyer group, up to 200 other updates to the database may be needed so that the change is effective, causes no problems, and is logged appropriately.
Instead of trying to teach the ZF2 dev team the BI, we have given the RPG team (aka the backend team) the task of creating an API for our web application to call whenever CUD (and I will throw in a 5th term, Calculate) must happen. Our RPG dev team is creating a library of single-purpose RPG programs to accomplish all the tasks needed. For example: deleting a Vendor in our system is not as simple as removing the vendor record. Thousands of products could be abandoned, and many checks are needed to see whether the Vendor is active in our processing. The user (and the front-end team) just wants to delete the supplier without worrying about all those considerations. Our RPG program (actually a CL program calling an RPG program, in case there are errors to handle) takes in our parameters, does all the necessary checks, and returns either a success code or a failure code with a message to display to the user.
All the experience and knowledge our RPG programmers have of our business stays where it should. We are also taking the time to comment in our API what is happening and why, so that a future move to another language will be easier.
With all these single-purpose API calls, the web dev team now has the ability to do some ‘cool stuff’. On install of our web app, the user has a chance to set the API. This is just a way to future-proof and allow us to migrate to another API if we ever move off RPG (we have no plans for this, but it pays to plan ahead!). It is just a simple key-value pair in our service manager:
return array(
    'k3s_settings' => array(
        'api' => 'rpg',
    ),
);
Now that that is set, we create a ZF2 module to house all the RPG we need to call. We create a controller to handle the RPG programs that work on a specific concern: if we are doing operations on vendors, the CUD operations go in the one vendor controller. We then use the action to decide which program we are going to call:
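A sketch of such a controller could look like the following. The module, service, and RPG program names here are hypothetical, standing in for whatever your toolkit wrapper and CL/RPG programs are actually called:

```php
<?php
// Hypothetical vendor controller: each action wraps one RPG program call.
namespace K3sApi\Controller;

use Zend\Mvc\Controller\AbstractActionController;
use Zend\View\Model\JsonModel;

class VendorController extends AbstractActionController
{
    public function deleteAction()
    {
        $vendorId = (int) $this->params()->fromRoute('id', 0);

        // 'K3s\RpgToolkit' and 'VNDDEL' are illustrative names for the
        // toolkit wrapper service and the CL/RPG program it calls.
        $toolkit = $this->getServiceLocator()->get('K3s\RpgToolkit');
        $result  = $toolkit->call('VNDDEL', array('vendor_id' => $vendorId));

        // Return JSON instead of a view: success/failure plus any message
        // the RPG program sends back for the user.
        return new JsonModel(array(
            'success' => $result['success'],
            'message' => $result['message'],
        ));
    }
}
```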
Instead of returning a view model so a user can go to a screen, we return JSON. We have now created web API access to our RPG API. This might seem convoluted, but it allows some really neat stuff. We can now use jQuery tools to call RPG when we need to update. Using a tool called jEditable, I take in 50 parameters, check all the considerations, update my vendor, make all the additional database changes needed, and log what the user did. To the user it is simple, quick, and painless. The web dev team never knows how difficult the operation just performed really was. And the backend team isn’t concerned with the UI interaction; they have their responsibility, the interface, and it worked!
Now that we have this setup, with responsibility correctly given to the right teams, we can develop faster and build a stronger application without losing the fidelity of the current program.