Saturday, April 16, 2011

New FireGSS version for Firefox 4.0

FireGSS is the desktop client for gss and the Pithos service, in the form of a Firefox extension. Yesterday, I tested it on Firefox 4.0 for the first time (a bit late, I know). All looked well, except for two things: a) the version number in the about box was displayed as ... undefined! and b) all menus were rendered semi-transparent (definitely not usable).

What's wrong?!? Firebug and Chromebug to the rescue! The Firebug console displayed an error message at the line


var extManager = Cc["@mozilla.org/extensions/manager;1"].getService(Ci.nsIExtensionManager);


It seems that Firefox 4.0 has changed the way to programmatically access the extensions manager. The new way is

Components.utils.import("resource://gre/modules/AddonManager.jsm"); // load the AddonManager module
AddonManager.getAddonByID("addon id", function (addon) {
    // do something with your addon; here we read addon.version
});


As you can see, the new API is asynchronous and returns its result through a callback function. That required some refactoring on our part, to make sure that the about box is not displayed before we have the version number.

The second problem was that we had defined the menus as popup elements inside a popupset element. The popup element is another thing that is no longer supported in 4.0; it has been replaced by menupopup.

So after those changes the extension plays well with Firefox 4.0, but no longer with 3.*. Since it does not have any other functionality enhancements, this is not a problem: users of Firefox 3.* can continue using FireGSS v0.18 and users of Firefox 4.0 can upgrade to 0.19.

Until we resolve an issue with the update site, you can manually update to version 0.19 by downloading it from here.

Sunday, April 10, 2011

Using Solr as a fast database cache

In gss (the open source project that the "Pithos" service and mynetworkfolders are based upon), we use Apache Solr for full-text indexing and searching of the stored documents. However, when a user searches for some terms, we have to show her only results from documents that she has read permission on, i.e. her own documents, documents shared with her by others and documents made public.

During some benchmarks we did recently, we observed extremely high response times, even for searches that had very few results. After some code reviews and more fine-grained benchmarks, we realized that over 60% of the time it took for a search to complete was spent on permission checks in the database that stores the document metadata. An example search for the term 'java' returned something in the area of 2000 results from Solr. After that, each result had to have its permissions checked, to see whether the user who did the search has read permission on it, and results that cannot be read by that user had to be filtered out. The response from Solr was blazingly fast; the transformation of the SolrDocument objects to gss resources and the marshaling to JSON was around 40% of the total time, and the remaining 60% was the permission checking.

So we thought: if the Solr search is so fast, why don't we store the document permissions in the index and transform the search query to include the user? That way the search returns only the relevant results (those the user has read permission on) and no permission checking and filtering is necessary. More specifically, whenever a file is created or updated, we store in the index the user and group ids that have read permission on the file. Now, when a user does a search, we retrieve the groups that the user belongs to and append to the search query a term that checks whether the user id or one of the group ids matches those stored with the file. That way the search returns only the relevant results, improving search times by more than 60%.
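As a rough sketch of the idea (the core URL, field name and ids below are made up for illustration and are not the actual gss schema), the appended term can be expressed as a Solr filter query over a multi-valued field that holds the ids with read permission on each document:

# search for 'java', restricted to documents readable by user u42,
# who belongs to groups g7 and g13 (hypothetical field name and ids)
curl -G "http://localhost:8983/solr/select" \
     --data-urlencode "q=java" \
     --data-urlencode "fq=readers:(u42 OR g7 OR g13)" \
     --data-urlencode "wt=json"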

Note: Care should be taken to update the index not only when a file is updated, but also whenever its permissions change. However, this is not something that happens often, and index updating is done asynchronously through a message queue, so the load imposed on the server is insignificant.

Friday, March 25, 2011

My Google Web Toolkit Talk

If you are new to GWT or have never heard of it, take a look at my recent talk about Google Web Toolkit at the 3rd Greece GTUG (Google Technology User Group) Meetup. It is quite introductory, highlighting the main features of the toolkit. I am planning (time permitting) to add the transcript as well, because some slides are not self-explanatory. Your comments are always welcome.

Saturday, November 13, 2010

Inter-operation of Mercurial and Git

Recently, I had to incorporate into mynetworkfolders various fixes from the open-source gss base project. Unfortunately, when the initial version of mynetworkfolders was created, we didn't clone from gss, and now we have two completely independent trees: one in Mercurial (gss) and one in Git (mynetworkfolders). Fortunately, there is a way to link the two repos, so that changes can easily be migrated from one to the other.

First of all, we need fast-export.

git clone git://repo.or.cz/fast-export.git

then create a new Git repo into which we'll convert the Mercurial repo.

git init gss_git_repo
cd gss_git_repo
../fast-export/hg-fast-export.sh -r ../gss
git checkout HEAD

Now we have converted the gss Mercurial repo into the gss_git_repo Git repo.

Then, we fetch changes from this repo into the mynetworkfolders git repo.

cd ../mynetworkfolders
git remote add gss ../gss_git_repo
git fetch gss [gss_branch:new_branch]

The last option is not mandatory and is needed only if we want to fetch a particular branch from gss_git_repo into a new local branch.
Now the mynetworkfolders git repo has a new branch containing the gss changesets. If we do a merge, all changes from the beginning of the two projects will be merged, so there is a lot of work to do to keep only the changes we need. However, after this step and a commit, a common node is created in the mynetworkfolders repo, so from now on, if we repeat the above procedure, only the new changes from gss will be fetched into mynetworkfolders.
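In outline, the first (painful) merge and the later incremental updates look roughly like this (branch names as above; the exact conflict resolution obviously depends on the two histories):

# first time: merge the imported gss history into our main branch
git checkout master
git merge new_branch     # resolve conflicts and keep only the changes we need
git commit               # after git add-ing the resolved files

# later, when gss has new changesets:
cd ../gss_git_repo
../fast-export/hg-fast-export.sh -r ../gss    # picks up only the new changesets
cd ../mynetworkfolders
git fetch gss gss_branch:new_branch
git merge new_branch     # now only the new gss changes come in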

Saturday, October 23, 2010

Yet another how to convert an svn repo to git

If you don't have branches in your svn repo, converting to git is an easy task: just use git svn and you are done. I not only had branches, but I also had multiple projects in the same svn repo and didn't want to move all of them to git. So the first step is to use git svn, but explicitly specify where the trunk, branches and tags are for the particular project I wanted to move.

git svn clone http://path-to-root-of-svn-repo -A authors.txt -t tags/myproject -b branches/myproject -T trunk/myproject myproject


Notice that under the root of the svn repo, I have trunk, branches and tags folders. Each project has its trunk under trunk/project, its branches under branches/project and its tags under tags/project. The first argument to the command above is the root of the svn repo, and the trunk, branches and tags of myproject are given via the -T, -b and -t parameters. Nothing else worked correctly in my case. Notice also that you need an authors.txt file of the form

username = Firstname Lastname <email>

with all users that have committed to the svn repo. The process will abort if it finds a username that is not in the authors file, in which case you have to add it and run the command again.
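If you don't want to build this file by hand, a one-liner along the following lines can pre-populate it from the svn history (assuming usernames contain no spaces; you still have to fill in the real names and e-mail addresses yourself):

svn log -q http://path-to-root-of-svn-repo | \
  awk -F '|' '/^r/ {gsub(/ /, "", $2); print $2" = "$2" <"$2"@example.com>"}' | \
  sort -u > authors.txt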

After the command finishes (and it might take a while) we have our git repo under myproject. There is one problem though (actually there are two): all svn tags and branches are now git remote tracking branches. If we just push our git repo to a remote hosting site like github, we'll have only the master branch (previously the svn trunk). So we first have to convert all previous svn tags to real git tags and then push everything (master, branches and tags) to github.

git branch -r
will list all remote tracking branches in the newly created git repo. Those with a name of the form tags/tagname come from svn tags. As Paul Dowman explains in his post, for each one we have to create a git tag
git tag tagname tags/tagname
and then delete the remote tracking branch
git branch -r -d tags/tagname

The script I wrote to do the conversion is


git branch -r |awk -F "/" '$2 {printf("git tag %s tags/%s\n", $2, $2)}' |sh
git branch -r |awk -F "/" '$2 {printf("git branch -r -d tags/%s\n", $2)}'|sh


After that step we connect the local repo with github
git remote add origin git@github.com:user/myproject.git
and push; don't forget --tags, otherwise the tags won't be pushed.
git push origin master --tags
and push again with --all to push the branches
git push --all
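
To double-check what actually ended up on GitHub, you can list the remote refs:

git ls-remote --heads --tags origin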

Thursday, September 16, 2010

Testing GWT Designer

Google announced today on the GWT blog the availability of GWT Designer, among other things coming from the Instantiations acquisition. I couldn't help trying it, of course, so I installed the Eclipse plugin and tried to add a DialogBox to an existing project of mine. The plugin created the skeleton code for the new class, but when I clicked on the Design tab to see the graphical editor, Eclipse crashed miserably. It just disappeared from my screen with nothing written in the log (how typical). I tried updating everything related to Eclipse and testing again, with no luck. I then tried creating a new project from scratch, with the same results. A Google search didn't help either. Tomorrow I'll try it on the office computer and see what happens.

Sunday, September 5, 2010

Nodify made it to No 5 in the utility category!

Nodify is our entry in the Node Knockout coding contest. It is a Web-based IDE for writing Node.js applications in JavaScript. Initially, it was meant to be only for server-side apps that implement a REST-like API, but during the 48-hour contest we decided that it is better, for now, to take a more general approach.
The results came out yesterday and we are No 5 in the utility category!!! Apart from that, we had some very encouraging comments from the voters, and that is even more satisfying than the ranking alone. Anyway, Panagiotis says it all in the video; you can check out the project itself or download the code to play with. It is open-sourced under the MIT License.

...and thanks to all who voted for us.