SharePoint 2010 a Better Investment for Developers than LAMP

Time is Money

Currently, one of the most popular web development platforms is LAMP: Linux, Apache, MySQL, and PHP. It's used by so many because the whole stack is free. Well, sort of... a lot of time gets wasted in the development process on this platform; the tools simply aren't there.

Fortunately, if you agree that your time is not worthless, you might invest in something like ASP.NET, and many of your development pains go away. Unfortunately, as with LAMP, without a good framework, a solid understanding of database design, and strong security principles, you're going to be reinventing a wheel that has been picking up steam for a few years now.

The Wheel

SharePoint offers a framework that abstracts away database and security design; it gives you a complete content management system, a plug-in architecture, and, for the price, unmatched scalability. That doesn't even begin to take into account the growing ecosystem of third-party features available both for free and for a fee. The one piece that has been missing is developer tools.

The Tools

SharePoint wasn't even given the time of day when Visual Studio 2008 was being put together. Much of the development process made Visual Studio look like an over-glorified version of Notepad. The same cannot be said for Visual Studio 2010. Project templates, build processes, enhanced IntelliSense, wizards, and probably a unicorn or two make developing for SharePoint a reasonable task. What was once a six-step process to deploy and test a solution is now one button: F5. Deployment speed didn't improve, but at least I can get a can of Coke while my computer does it for me. No batch scripts necessary. More information about the tooling added to Visual Studio for SharePoint 2010 can be found in this post on the .NET Developer's Journal.

In the end, SharePoint 2010 offers a great application stack for developing some very compelling web products, and the tools that the Visual Studio team put together for the upcoming platform makes it very accessible for developers.

SharePoint Conference 2009 Take-Aways

Hive, Hive, Hive, <slap>, ... ok fine. 14 Root

For quite a while now, SharePoint developers have been referring to the 12 directory located on Web Front Ends as the "12 hive". Well, Microsoft no likey. It's now, at least by some, referred to as the "14 root".

RIP WSS, MOSS

In a great move, "Windows" and "Office" have both been removed from the nomenclature of SharePoint. To me, it made little sense to have this product straddling the line between these two brands, and with SharePoint 2010, there is enough backing behind it to have it stand on its own. Now referred to as SharePoint Foundation, the replacement for Windows SharePoint Services can still be installed as a free product. As your use of SharePoint expands, it can be upgraded to keep up and eventually become SharePoint Server (the successor to Microsoft Office SharePoint Server).

Client Object Model and REST

This is, by far, one of the things I am most excited about. Supporting WinForms/WPF, Silverlight, and JavaScript, this is going to be huge.

A Customizable Ribbon

The ribbon is beautiful and hugely customizable. Everyone that writes applications on SharePoint should use it, extend it, and embrace it. Unfortunately, one of SharePoint's best hand-holding features will make relying on it nearly impossible. SP2010, by default, installs in SP2007 UI mode... no ribbon. This is the right thing to do, but it certainly makes using the ribbon that much harder.

Yes, finally. It's finally there. Finally.

LARGE LIST THROTTLING

If you have a SharePoint solution that accesses list items in SharePoint 2007, I GUARANTEE it will NOT work in SP2010 unless you address this key change. In most development situations, it will not be immediately apparent either. Large lists are now supported in SP2010, but they are heavily throttled. Non-administrative users are limited to a default of 5,000 items per request, while admins can view up to 20,000. While CAML queries allow you to request specific limits, it is still likely that those requests will fail under these circumstances, because SharePoint doesn't know for certain that it can avoid looking at all items in the list anyway.

Expect more to come, with a few in-depth posts on the Client Object Model, REST, the Solution upgrade path, and Large List Throttling.

Stick-Figure Commentary on Why My Blog Has Been Quiet Lately

!SharePoint.Loves(me)

Events to Expect when Dynamically Loading iFrames in JavaScript - Take 2 (thanks Firefox 3.5)

Just two months ago, I wrote a post on the JavaScript events you could expect when loading iFrames dynamically. Since the behavior highly depends on the browser you are using and the whims of the developer who decided what to fire and when, I put together a nice table of Browser v. Content-Disposition listing the events and any return codes that come along with them.

This is where it all falls apart. With Firefox 3.5, it appears as though my table must change. Firefox no longer fires onload when the Content-Disposition is set to Attachment. This is the same behavior that can be expected in Safari and Chrome. Even though there appears to be a growing consensus outside of the IE camp, I do not think it's moving in the right direction. (IE's behavior is clearly the best here: more events, more specific, and more predictable.)

Disappointed, I'm posting this updated table depicting the events you should expect when dynamically loading iFrames:

Thanks for nothing.


Browsers tend to do what they like with your static content: deciding what to cache and when to refresh. On the server-side, you can specify that certain file types have a designated TTL (time to live) and that works in many cases. There are other cases when you either don't have access to the server's configuration or you need to make sure everyone has the latest bits of JavaScript.

With every release of VizitSP, our SharePoint Document Viewer, we need to make sure that users get all of the changes we've made. To achieve this, there's a very easy trick that can keep our files cached exactly as long as they need to be and it involves le query string.

What the HTTP Caching Specification Says

ThinkVitamin posted an article about the correct way to pull off browser caching while playing by the rules. This is a great way of doing things, and if you control the server your site is running on, please do that. However, if you're shipping a component or an installable product like our Vizit, you might not be afforded that liberty.

According [to] the letter of the HTTP caching specification, user agents should never cache URLs with query strings. While Internet Explorer and Firefox ignore this, Opera and Safari don’t - to make sure all user agents can cache your resources, we need to keep query strings out of their URLs.

Supporting only IE and Firefox didn't sound that attractive to me, so I decided to do a bit of research. Digging around the net, I found out that Safari stores its cache in

    C:\Documents and Settings\[username]\Local Settings\Application Data\Apple Computer\Safari\cache.db

It's a SQLite database, so a quick download of the free Sqlite3Explorer tool made querying this very easy.

    sqlite> select request_key from cfurl_cache_response LIMIT 735,5;
    http://graphics8.nytimes.com/js/app/lib/NYTD/0.0.1/tooltip.js
    http://graphics8.nytimes.com/images/global/icons/feed_icon_16x16.gif
    http://graphics8.nytimes.com/js/app/analytics/gw.js?csid=H07707
    http://graphics8.nytimes.com/js/app/analytics/revenuescience.js
    http://graphics8.nytimes.com/js/app/analytics/controller_v1.1.js

What do we have here? A static file (js extension) with a query string in Safari's cache.

This got me thinking about Opera and Chrome... luckily, they make it trivial to browse their cache using the about:cache URL. This is what I found:

[Chrome about:cache screenshot: static files with query strings, served from cache]

[Opera cache listing showing the same]

Cool huh? It appears as though IE, Firefox, Safari, Opera, and Chrome ignore this bit of the spec and I'm OK with that.

It's important that you update your version query string parameter with every release or this technique won't work. With Vizit, we do it as part of our build process: everywhere the code includes a JavaScript file, we append '?ver=VIZIT_VERSION', and at build time we sift through all the files and replace that placeholder with the version number of our assemblies. Very nice indeed.
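As a sketch of that build step (VIZIT_VERSION is our placeholder name; the file name and version string below are made up), the replacement itself is just a string substitution:

```javascript
// Hypothetical build-step sketch: swap the VIZIT_VERSION placeholder in
// script includes for the real assembly version.
function stampVersion(markup, version) {
    // "viewer.js?ver=VIZIT_VERSION" becomes "viewer.js?ver=2.1.0.0"
    return markup.replace(/\?ver=VIZIT_VERSION/g, '?ver=' + version);
}

var page = '<script src="viewer.js?ver=VIZIT_VERSION"></script>';
var stamped = stampVersion(page, '2.1.0.0');
// stamped now points the browser at viewer.js?ver=2.1.0.0, so the old
// cached copy is ignored and the new one is cached until the next release.
```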

Results

On its first load with an empty cache, an upcoming release of VizitSP weighs in at about 420 KB. That small size is achieved through CSS image spriting, JavaScript compression, concatenation, and the SharePoint Web Front Ends gzipping it all; a second load, however, sees that size drop to a measly 75 KB. Compare that to the New York Times homepage, which weighed in at 763 KB on first load and only dropped to 370 KB once cached. The New York Times is a news site, so much more of it needs to be refreshed all the time; still, taking advantage of the fact that Vizit can update at will makes for a very nimble and fast-loading web application.

Visual Studio 2008 Intellisense for Ext JS

Ever thought it would be totally boss to have JavaScript Intellisense for Ext JS in Visual Studio 2005? Too bad. If you were talking about VS 2008, though, I've got your back.

Intelli-Referencing External JavaScript

VizitSP is a fairly large project and much of that heft is on the client side. With that, we have a lot of JavaScript to manage. One of the techniques we use to help ease the burden is to maintain separate files for each JS class and then merge and minify them all together during the build process. Unfortunately, splitting the files up would seem to sacrifice all the Intellisense we'd get for free if we hadn't. Not exactly.

Scott Guthrie posted this article about JavaScript Intellisense in Visual Studio back in 2007. While going over some of the basics, he also talks about how to reference external JS files using an XML comment at the top of your script.

    /// <reference path="path/to/script.js" />

They even load references of references. That means that when I reference the script I have located at "path/to/script.js" that has an XML reference comment pointing at "path/to/otherScript.js", my script gets Intellisense for script.js and otherScript.js! I'm easily amused.
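For example (file names hypothetical), a chained pair of files might look like this; referencing script.js from a third file lights up Intellisense for both:

```javascript
// path/to/otherScript.js (hypothetical helper file)
function greet(name) {
    return 'Hello, ' + name;
}

// path/to/script.js -- its reference comment points at otherScript.js,
// so any file that references script.js gets Intellisense for greet too.
/// <reference path="path/to/otherScript.js" />
function greetLoudly(name) {
    return greet(name).toUpperCase();
}
```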

Back in November, Visual Studio Web Tools Program Manager, Jeff King, posted this FAQ on JScript Intellisense in Visual Studio 2008 that you should take a look at as well.

Give me Ext JS

This is all well and good, but what I really want is all of Ext JS at my fingertips. I want to be able to get some Intellisense love for the entire API. My first thought was to just add a reference comment pointing to ext-all-debug.js and maybe ext-base.js.

Long story short, that blew up in my face and was way too big of an undertaking to deal with. I began stripping out any incompatible bits of code and got nowhere fast. A little searching on the net led me to people out there who clearly have way more time on their hands (or need for this) than I do. While there are a few forum posts on the topic with supposed links to working script, this is the only one that didn't return a 404.

While it's not entirely there, it's pretty close. Some classes exist in multiple places in Ext's namespace as a convenience and while the object itself appears in the Intellisense, a constructed version of it does not have any members or methods. I tend to use Ext.form.BasicForm over Ext.BasicForm... so, I'm out of luck. However, it would not take much to add it.

Flash, Silverlight just Stop-Gaps

The F.U.D.

There has been a lot of talk in the past few weeks about HTML 5 and its supposed takeover on the web (or lack thereof). It's only a matter of time before Flash and Silverlight become less relevant. The CEO of Adobe, Shantanu Narayen, defended their position in the future of the web during the company's last quarterly financial call, saying:

So [, we're] clearly supportive in terms of making sure as HTML 5 is evolving that we will support it in our web authoring tools but from the perspective of continuing to drive Flash and innovation around Flash and rich Internet applications, we still think that actually the fragmentation of browsers makes Flash even more important rather than less important.

The part of the question that he is dodging here is how a more powerful and more broadly supported standard will affect Flash in the long term.

Recently, InfoWorld posted an article about this very topic. It's a good read and the author gets a lot of things right. He talks about how IE will be a critical milestone for HTML 5 to overcome:

The exception is Microsoft, which therefore is in a difficult situation, says Almaer. The company has heavy investments in trying to propel Silverlight to dominance. "That's a big elephant in the room for them because you can imagine the Silverlight team [whose] whole existence is to add [this] functionality in. [But] if Internet Explorer puts it already in there, why do we have Silverlight?" he asks.

Good question. What incentive does Microsoft have, from a Silverlight perspective, to further the advancement of HTML 5? When IE adds support for all of the capabilities that Silverlight has to offer directly into the browser, does Silverlight still need to exist? No, it doesn't. It's arguably in Microsoft's best interest to keep its feet stuck in the mud and continue to not be a 'standards-based' browser.

The Push

This roadblock is ever more apparent with the announcement of Firefox 3.5 RC. It, along with Safari and Chrome, already support much of what HTML 5 has coming up in the pipeline. Internet Explorer is the only major player that is mum on the subject.

Fortunately, the push for more advanced capabilities in the HTML standard is so great that some of IE's shortcomings can be sidestepped. There is a project put together by Google on SourceForge called ExplorerCanvas. It allows for 2D command-based drawing. Browsers like Safari, Firefox, Chrome and Opera all support the HTML5 canvas... but Internet Explorer does not. The ExplorerCanvas project overcomes this by taking Microsoft's Vector Markup Language and wrapping it in JavaScript to create the desired effect. The DotImage Web Annotation Viewer that we use in our SharePoint Document Viewer, VizitSP, employs the ExplorerCanvas to let users author annotations in a very natural way.
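As a rough sketch of what that buys you (the drawing and element id here are made up): the same command-based 2D drawing code runs everywhere, as long as IE pages pull in excanvas.js through a conditional comment first.

```javascript
// Sketch: 2D drawing that is canvas-native in Safari, Firefox, Chrome,
// and Opera, and VML-backed in IE once excanvas.js is included via
// <!--[if IE]><script src="excanvas.js"></script><![endif]-->.
function drawAnnotationBox(ctx, x, y, width, height) {
    ctx.strokeStyle = '#ff0000'; // red outline
    ctx.lineWidth = 2;
    ctx.strokeRect(x, y, width, height);
}

// Typical use on a page with a <canvas id="annotations"> element:
// drawAnnotationBox(
//     document.getElementById('annotations').getContext('2d'),
//     10, 10, 120, 40);
```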

Another example of this push is Kroc Camen's recent publication of Video for Everyone. By using the upcoming <video> tag along with Flash and Quicktime object tags, he is able to provide a series of fallbacks that can offer HTML5 love to people who deserve it while providing an acceptable experience for those who don't.

One area where I think InfoWorld gets it wrong is in its comment about Google's possible apprehension toward pushing the new standard:

..its YouTube subsidiary uses Flash for its video, but the inclusion of HTML 5 capabilities in browsers might cause YouTube to rethink that decision, notes Fette. "It's a cost/benefit analysis that they'd need to make."

For a couple months now, YouTube has had an HTML 5 demo available on its site. The example shows that, with a capable browser, you can view video without any codecs or plugins. It just works. YouTube has clearly already begun to invest the time into moving away from Flash as soon as it's feasible. Without much effort, YouTube could begin serving up HTML5 video to those browsers that support it. To push the envelope even further, YouTube could add features to the HTML5 version that are either not possible to do in Flash or just as an incentive to get people to use better browsers... or to give IE a hint.

The Point

The point here is that we can't let such a fundamental piece of the future of the Internet rely on plugins. Something that is so core to the direction the web is vectored toward deserves to be treated as such: core. While it might take us a while to get there, we will, even if it means dragging the old browsers kicking and screaming behind us. Until then, Flash and Silverlight can rule the roost.


NOTE: An update to this article has been posted and can be found here - Take 2 (thanks Firefox 3.5)

Almost every JavaScript project starts off with a conversation about browser compatibility. As much as we'd like to think that JavaScript is JavaScript, each browser has its own implementation; and depending on what you're trying to do, those implementations could be drastically different from one another.

This case of inconsistent JavaScript behavior can be found in the events you can expect to receive during and after the dynamic loading of iFrames.

After quite a bit of testing and head scratching, we were able to come up with the following table of events to expect when dynamically loading content into an iFrame:
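For reference, the kind of loading the table describes looks roughly like this (a sketch; the event wiring is exactly the part that varies by browser):

```javascript
// Sketch: dynamically load a URL into a hidden iFrame and report the
// events that fire. Which callbacks actually run depends on the browser
// and the response's Content-Disposition, as broken down below.
function loadInFrame(url, report) {
    var frame = document.createElement('iframe');
    frame.style.display = 'none';
    frame.onload = function () {
        report('onload');
    };
    // IE only: readyState walks through 'loading'/'interactive'/'complete'.
    frame.onreadystatechange = function () {
        report('readystatechange: ' + frame.readyState);
    };
    document.body.appendChild(frame);
    frame.src = url; // setting src kicks off the request
    return frame;
}
```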

What is Content-Disposition?

Sometimes when you provide your users with a file download link from a web page, you might not be linking them directly to the file, but to an ASPX page or PHP script instead, which streams the content of the file back to the client. In order for your browser to know what to do with this content, you provide header information with the response including Content-Type and Content-Disposition.

Content-Type is exactly what it sounds like: "application/pdf" if the content is PDF, "image/jpeg" if the content is JPEG, and so on. Content-Disposition refers to how you want the browser to handle the file after it is downloaded. It comes in a few flavors, but these are the ones that matter for this article: "inline" where the browser will attempt to load it (or an application that can load it) in the browser window or frame specified, and "attachment" where the browser will prompt the user to download.
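In code terms (a hypothetical JavaScript helper, not the actual ASPX/PHP script the post mentions), the headers a download script sends look like this:

```javascript
// Build the two response headers a file-streaming script would send.
// contentType is e.g. 'application/pdf'; disposition is 'inline' or
// 'attachment'.
function downloadHeaders(contentType, disposition, filename) {
    return {
        'Content-Type': contentType,
        'Content-Disposition': disposition + '; filename="' + filename + '"'
    };
}

// 'attachment' prompts the user to save; 'inline' asks the browser (or a
// plugin) to render the content in the specified window or frame.
var headers = downloadHeaders('application/pdf', 'attachment', 'report.pdf');
```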

Table Breakdown

The table above is for quick reference. I'm sure I'm going to need to Google for this answer in the future, so, "You're welcome, future-self." Here's a breakdown of each browser and what events should be expected and when they fire.

Safari / Chrome

Attachment: ... thanks for nothing. Neither of these browsers provide any events for iFrames loaded with content that have an "Attachment" disposition type.

Firefox

Inline: This behaves identically to Safari / Chrome in the Inline case.

Attachment: This also behaves identically to the Safari / Chrome Inline case; onload still fires even though the content is handed off as a download rather than displayed.

IE (surprisingly gets my vote of approval here)

There is an onreadystatechange event that fires whenever the iFrame's readyState property changes. That readyState reflects where the download is in the process.

Inline: When you initially set the src value of the iFrame element, the readyState changes to loading. When the file has completely downloaded, the readyState changes to interactive. The big difference between IE and the other browsers is that IE then changes the readyState property to complete when the page (or application) is fully loaded and ready for the user.

Attachment: This behaves identically to the Inline case of IE, but the readyState property never changes to complete. That wouldn't make much sense, since the user has to manually open the file by double-clicking on it or opening it from some application.
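A sketch of watching those transitions (assuming frame is an iFrame element whose src was just set):

```javascript
// Log IE's readyState transitions for an iFrame. Per the cases above,
// Inline content ends at 'complete'; Attachment stops at 'interactive'.
function watchReadyState(frame, log) {
    frame.onreadystatechange = function () {
        log(frame.readyState);
        if (frame.readyState === 'interactive') {
            // The download has finished. For an Attachment response this
            // is the last state you will ever see.
        }
    };
}
```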

Final Thoughts

As you can see, this can be quite a pain; but with a little bit of research and knowing exactly what you want your customers to experience, you can have a fairly consistent process for all browsers.

Using YUI Compressor with eval

YUI Compressor is an excellent JavaScript minifier and obfuscator, but every good thing comes with its compromises. The tool prides itself on some of the best minification numbers I've seen, and it's also very reliable: your code gets smaller but will almost certainly work the same. While a few tools out there find trouble with eval and with, YUI Compressor gets around them by skipping them completely: it refuses to touch any code that is within the scope of an eval or with call. That might produce highly reliable, minified code, but in some cases the code will just be reliable... unminified code. Good but not great.

Basic Case

Below is the example we'll be working with. We'll use it to demonstrate how eval can get trapped inside a scope causing YUI Compressor to ignore it completely.

    var Test = new function() {
        var _self = this;

        function performOperation(src) {
            return "Result: " + eval(src);
        }

        _self.Operate = function(x, y, operation) {
            return performOperation(x + operation + y);
        }
    }

Above, the eval can reference anything inside the Test class. It's hard to see how YUI Compressor could get this wrong, but if we add an internal variable and modify the performOperation method to make use of it, you can see how things might get sticky.

    var Test = new function() {
        var _self = this;
        var _internalVariable = 3;

        function performOperation(src) {
            return "Result: " + eval("(" + src + ") * _internalVariable");
        }

        _self.Operate = function(x, y, operation) {
            return performOperation(x + operation + y);
        }
    }

This makes it a little more obvious. If YUI Compressor had its way with this code and weren't as cautious as it is, _internalVariable would be renamed to some arbitrary string of letters and the eval call would fail.

You can see this by running the compressor on our script. The following code is the result:

    var Test=new function(){var _self=this;var _internalVariable=3;
    function performOperation(src){return"Result: "+eval("("+src+") * _internalVariable")}
    _self.Operate=function(x,y,operation){return performOperation(x+operation+y)}};

By adding the requirement that eval can only be called on public facing methods, we are then able to pull it out of the Test object entirely.

    var Test = {
        Eval: function(src) {
            return eval(src);
        }
    }
    Test.Operator = new function() {
        var _self = this;

        function performOperation(src) {
            return "Result: " + Test.Eval(src);
        }

        _self.Operate = function(x, y, operation) {
            return performOperation(x + operation + y);
        }
    }

While this allows for much more minified JavaScript, it does limit us a bit much. We can remedy this by adding a scope variable to the Test.Eval signature, allowing us to pass in the appropriate scope.

    var Test = {
        Eval: function(src, scope) {
            return eval(src);
        }
    }
    Test.Operator = new function() {
        var _self = this;
        _self.Variable = 3;

        function performOperation(src) {
            return "Result: " + Test.Eval("(" + src + ") * scope.Variable", _self);
        }

        _self.Operate = function(x, y, operation) {
            return performOperation(x + operation + y);
        }
    }

It might look a little confusing, but read it over; it's right.
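To convince yourself, here is the same pattern end to end with a call, worked out by hand: Operate(2, 4, '*') builds the string "2*4", and Test.Eval multiplies the resulting 8 by scope.Variable, which is 3.

```javascript
var Test = {
    Eval: function(src, scope) {
        // 'scope' is visible to the evaluated script by its parameter name
        return eval(src);
    }
};
Test.Operator = new function() {
    var _self = this;
    _self.Variable = 3;

    function performOperation(src) {
        return "Result: " + Test.Eval("(" + src + ") * scope.Variable", _self);
    }

    _self.Operate = function(x, y, operation) {
        return performOperation(x + operation + y);
    };
};

var out = Test.Operator.Operate(2, 4, '*'); // "Result: 24"
```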

Since YUI Compressor will not touch any code that has eval in its scope, we don't have to worry that the scope property will change.

    var Test={Eval:function(src,scope){return eval(src)}};
    Test.Operator=new function(){var a=this;a.Variable=3;
    function b(c){return"Result: "+Test.Eval("("+c+") * scope.Variable",a)}
    a.Operate=function(c,e,d){return b(c+d+e)}};

The only bit of code that is not minified is the Test.Eval method due to its use of eval. (return lines added for readability)

While this might not show much benefit as far as code-length savings go, keep in mind that the example above is very small. Any reasonably long class will show huge improvements when minified properly, not to mention the benefit of making your intellectual property less readable.

Real World Example

I will detail the following code in a future post, but for now, I wanted to show how the above technique of extracting eval calls out of a class can benefit JavaScript length after compression.

    var IAT = {};
    IAT.Eval = function(scr, scope) {
        return eval(scr);
    }
    IAT.Importer = new function() {
        var _self = this;
        var _waitingClasses = [];

        function processWaitingClasses() {
            for(var i = 0; i < _waitingClasses.length; i++) {
                var refs = _waitingClasses[i].refs;

                var removeRefs = [];
                for(var k = 0; k < refs.length; k++) {
                    if(isDefined(refs[k]))
                        removeRefs.push(k);
                }

                for(var k = removeRefs.length - 1; k >= 0; k--) {
                    refs.splice(removeRefs[k], 1);
                }

                if(_waitingClasses[i].refs.length < 1) {
                    var cls = _waitingClasses[i];
                    _waitingClasses.splice(i, 1);
                    setClass(cls);
                    break;
                }
            }
        }

        function setClass(def) {
            // 'scope' will be evaluated as def in IAT.Eval
            IAT.Eval(def.clsName + '=scope.cls();', def);
            processWaitingClasses();
        }

        function isDefined(str, scope) {
            if(!scope) scope = 'window';
            var path = str.split('.');

            var scr = scope + '.' + path[0];
            if(!IAT.Eval(scr)) return false;

            if(path.length == 1) return true;

            path.splice(0,1);

            str = path.join('.');
            if(path.length > 1) str = str.substring(0, str.length);

            return isDefined(str, scr);
        }

        _self.Ns = function(str, scope) {
            if(!scope) scope = 'window';
            var path = str.split('.');

            var scr = scope + '.' + path[0];
            if(!IAT.Eval(scr)) IAT.Eval(scr + '={};');

            if(path.length == 1) return;

            path.splice(0,1);

            str = path.join('.');
            if(path.length > 1) str = str.substring(0, str.length);

            this.Ns(str, scr);
        }

        _self.DefineClass = function(config) {
            var parent = config.clsName.substring(0, config.clsName.lastIndexOf('.'));
            _self.Ns(parent);
            _waitingClasses.push(config);
            for(var i = 0; i < config.refs.length; i++) {
                _self.Import(config.refs[i]);
            }
            processWaitingClasses();
        }

        _self.Import = function(clsName) {
            if(isDefined(clsName)) return;

            var path = clsName.split('.');
            path.splice(0,1);
            var clsPath = path.join('/');
            clsPath = clsPath.substring(0, clsPath.length);

            var head = document.getElementsByTagName('head')[0];
            var script = document.createElement('script');
            script.setAttribute('type','text/javascript');
            script.setAttribute('language','javascript');
            script.setAttribute('src','Library/js/' + clsPath + '.js');
            head.appendChild(script);
        }
    }

(Before YUI Compressor)

    var IAT={};IAT.Eval=function(scr,scope){return eval(scr)};
    IAT.Importer=new function(){var a=this;var d=[];
    function c(){for(var j=0;j<d.length;j++){
    var h=d[j].refs;var l=[];for(var g=0;g<h.length;g++){
    if(b(h[g])){l.push(g)}}for(var g=l.length-1;g>=0;g--){
    h.splice(l[g],1)}if(d[j].refs.length<1){
    var f=d[j];d.splice(j,1);e(f);break}}}
    function e(f){IAT.Eval(f.clsName+"=scope.cls();",f);c()}
    function b(i,f){if(!f){f="window"}var h=i.split(".");
    var g=f+"."+h[0];if(!IAT.Eval(g)){return false}
    if(h.length==1){return true}h.splice(0,1);i=h.join(".");
    if(h.length>1){i=i.substring(0,i.length)}return b(i,g)}
    a.Ns=function(i,f){if(!f){f="window"}var h=i.split(".");
    var g=f+"."+h[0];if(!IAT.Eval(g)){IAT.Eval(g+"={};")}
    if(h.length==1){return}h.splice(0,1);i=h.join(".");
    if(h.length>1){i=i.substring(0,i.length)}this.Ns(i,g)};
    a.DefineClass=function(f){
    var h=f.clsName.substring(0,f.clsName.lastIndexOf("."));
    a.Ns(h);d.push(f);for(var g=0;g<f.refs.length;g++){
    a.Import(f.refs[g])}c()};a.Import=function(i){
    if(b(i)){return}var j=i.split(".");
    j.splice(0,1);var f=j.join("/");
    f=f.substring(0,f.length);var h=document.getElementsByTagName("head")[0];
    var g=document.createElement("script");
    g.setAttribute("type","text/javascript");
    g.setAttribute("language","javascript");
    g.setAttribute("src","Library/js/"+f+".js");h.appendChild(g)}};

(After YUI Compressor - Return lines added for readability - Over a 25% savings!)

The character length drops from 1788 before compression to 1317 after. Any additional code added to this class that does not contain eval will only increase the savings percentage, making your users' downloading experience that much snappier!

Changing Maximum Upload Size in WSS 3.0 and MOSS 2007

Looking around the web, I found many articles describing how to change the Maximum Upload Size in SharePoint, but for whatever reason, the options in the instructions were not available on my SharePoint installation. Blog posts by Joel Oleson on File Name, Length, Size and Invalid Character Restrictions and Recommendations and this one by Ronnie Guha mention the "Configure Virtual Server Settings" area on the "Virtual Server List" page in Central Admin. Neither of these existed. After some poking around, here is the correct navigation for WSS 3.0 and MOSS 2007. I'm not sure when in the builds of WSS this change occurred, so follow whichever set of directions applies to the options you have available.

1. Open SharePoint Central Administration.
2. Open the Application Management tab.
3. Choose Web Application General Settings (take note that the URL is /_admin/vsgeneralsettings.aspx - that appears to prove its legacy in Virtual Server).
4. Scroll down until you find the section labeled Maximum Upload Size.
5. Change away.

There is an element in the web.config called httpRuntime. The maxRequestLength attribute (defaulted to 51200) is traditionally used to change the maximum file upload size in ASP.NET web applications, but does not appear to have any effect on SharePoint.
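For reference, the element in question looks like this (the value is in KB, so 51200 works out to 50 MB); again, changing it alone does not appear to raise SharePoint's upload limit:

```xml
<system.web>
  <!-- maxRequestLength is in KB; 51200 KB = 50 MB -->
  <httpRuntime maxRequestLength="51200" />
</system.web>
```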

I highly recommend you take a look at Joel's post on this topic to get an idea of how you should set that value based on your needs, but these instructions should take you the rest of the way.

SharePoint Best Practices - San Diego

I have to start off by saying that this was the best conference I've been to. Maybe that doesn't say much since I've only been to a handful, but it has topped them all. Last year, I attended conferences such as AJAX World and Microsoft's ReMix in Boston and they were OK at best. The SharePoint Best Practices show brought many of the industry leaders together in an effort to spread knowledge to a broad range of people in a broad range of sectors. From developers and system integrators to project managers and CIOs, SharePoint Best Practices didn't just sell SharePoint to people (I think its user base is beyond that); it taught them how to use it, why they'd want to, and what to do to get the most out of it without going insane. SharePoint is a great platform and I look forward to my continued work on it.

My Agenda (Notable sessions)

My goal at this show was to learn as much as possible about deployment, permissions, testing, and WebParts (namely "What can I do with a WebPart that doesn't seem normal?"). Every session I went to was fantastic, so I'll only highlight the ones that stood out.

Best Practices for Developing Web Parts : Todd Bleeker

While I was at this presentation, I decided to open an IM window and beam all of my notes about the talk to my colleague, Dave Terrell. By the end of our conversation, we ended up buying a few copies of Todd's book, Developer's Guide to the Windows SharePoint Services v3 Platform.

Todd covered everything from packaging the WebPart into a solution as part of a feature to some of the cool things you can do with invisible web parts (I'll be exploring this more in a future post).

Automating your test environments for SharePoint development using Hyper-V : Ben Robb

Side note: On the last night, we ended up going to SharePoint by Day, SharePint by Night, and I played pool with Ben. I must confess that I am absolutely terrible at pool. It didn't help that Ben is quite good at the game. Maybe I was imagining it, but I think I heard him say he played snooker semi-professionally. It was a good time.

During his talk, Ben discussed automating the provisioning of new SharePoint virtual machines for integration testing and development. One thing that we'll be adopting from this talk is the idea of a-new-machine-a-week. In this setup, every developer gets a fresh machine every Monday. This way, we know the machine is clean and there's no cruft that could cause issues in development.

Best Practices for Disposing SharePoint Objects : Todd Bleeker

Besides going over the basics of properly disposing your SPWeb and SPSite objects that you create, Todd highlighted a new tool from Microsoft called SPDisposeCheck. It has been less than 2 business days since I left San Diego and it is already part of our build process here. Hopefully Microsoft will continue to improve the tool (namely fix the -xml output parameter). It's a great find.
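The core rule Todd covered is worth restating: SPSite and SPWeb objects that your code creates must be disposed, while objects handed to you by the framework (such as SPContext.Current.Web) must not be. A minimal sketch (the URL is a placeholder):

```csharp
using System;
using Microsoft.SharePoint;

// Dispose what you create: both the SPSite and the SPWeb opened from it.
using (SPSite site = new SPSite("http://server/sites/demo"))
using (SPWeb web = site.OpenWeb())
{
    Console.WriteLine(web.Title);
}

// Do NOT dispose objects the framework owns:
// SPWeb contextWeb = SPContext.Current.Web;  // no using block here
```

SPDisposeCheck flags exactly these kinds of mistakes by inspecting your compiled assemblies.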

Best Practices for Unit testing on SharePoint : Francis Cheung

In an earlier talk in the conference, Francis discussed abstracting all SharePoint object model calls into Repository classes that made code easier to read and allowed your business logic to be tested outside of SharePoint. During this presentation, Francis explained how to test those repositories using TypeMock for SharePoint. It's a pay-for product, but is really the only way to get the job done (outside of mocking all of WSS yourself).
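The repository idea is simple: hide the object model behind an interface so your business logic can be tested against a fake. A rough sketch of the pattern as I understood it (the names here are my own, not from Francis's talk):

```csharp
using System.Collections.Generic;
using Microsoft.SharePoint;

public interface IAnnouncementRepository
{
    IList<string> GetTitles();
}

// Production implementation: the only class that touches the object model.
public class SPAnnouncementRepository : IAnnouncementRepository
{
    private readonly SPWeb web;

    public SPAnnouncementRepository(SPWeb web)
    {
        this.web = web;
    }

    public IList<string> GetTitles()
    {
        List<string> titles = new List<string>();
        foreach (SPListItem item in web.Lists["Announcements"].Items)
        {
            titles.Add(item.Title);
        }
        return titles;
    }
}

// Business logic depends only on IAnnouncementRepository, so a unit test
// can pass in a hand-rolled fake or a TypeMock-generated mock instead of
// standing up a real SharePoint instance.
```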

Secure Coding Practices for the Administrator : Maurice Prather

This talk should have been called, "How to determine whether or not your third party solution provider has good SharePoint programming practices". Maurice discussed how developers should be using CAS policies to specify what level of security their assemblies require, when to ask your vendor for more information about their GAC-deployed assemblies, and much more.

Using Elevated Privileges and Impersonation : Paul Schaeflein & Maurice Prather

Paul started off by discussing what it means to "run with elevated privileges" and how it differs from impersonating the system account (or another user). I will post more on this topic in the coming days. The two concepts are very different, and the right one to use depends on what you are trying to do.
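Until that longer post, here is the shape of the elevation pattern as a sketch. The key subtlety is that objects created before the delegate runs keep their original security context, so you have to recreate the SPSite inside the delegate:

```csharp
using System;
using Microsoft.SharePoint;

// Capture the ID outside the delegate; SPContext objects created before
// elevation keep the current user's security context.
Guid siteId = SPContext.Current.Site.ID;

SPSecurity.RunWithElevatedPrivileges(delegate()
{
    // Objects created inside the delegate run as the app pool account.
    using (SPSite site = new SPSite(siteId))
    using (SPWeb web = site.OpenWeb())
    {
        // privileged work here
    }
});
```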

More to Come

This was an excellent conference and I got a lot out of it. This will surely shape future development. I will be posting more on each of these topics with how-tos and the like. If you want more information about any of these talks, post a comment to move them up my stack.

Installing SharePoint on Windows Vista

Check out this article on Bamboo Solutions to install Windows SharePoint Services 3.0 SP1 on Vista x64/x86.

We use that here in our office for demo laptops. It allows us to get the best performance when demoing our SharePoint products at shows by letting us run SharePoint directly on the hardware instead of in a VM. This is definitely a case where two OSs are not better than one.

I don't recommend that you use this for your development setup. It is a good idea to develop against WSS since it locks you into a feature set that makes your code deployable to both WSS and MOSS. But by installing SharePoint in a VM, you can roll back with snapshots. In this sort of Vista+SharePoint environment, if SharePoint bombs, you have little luck getting back without a system restore.

Hristo Pavlov's Blog on AllowUnsafeUpdates in SharePoint

This is a great post. A lot of good research was done to get to the bottom of this complicated topic.

SharePoint Saturday: Developing and Packaging a Third Party SharePoint Solution

Here are some notes from my presentation at SharePoint Saturday this past weekend in Virginia Beach, VA. Included are my slides, code that I demo'd, and hopefully answers to all of the questions that people asked.

The Slides

Download my slides as a QuickTime movie to view the presentation with all animations. Unfortunately, to keep the file size low, I had to sacrifice a lot of quality. I suggest you download both the PDF and movie to get the full effect.

The Code

The Feature that we worked on uses a toolkit called DotImage. If you're interested in using the feature for testing, feel free to get an evaluation license.

SPWebConfigModification

Here are links to Dave Terrell's 6 part series on using the SPWebConfigModification object to edit the web config of a given Web Application in SharePoint.
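In broad strokes, the API works like this: you describe the change as an XPath-addressable node, stamp it with an owner so it can be found and removed later, and then push it out to every server in the farm. This is a hypothetical sketch, not Dave's code; the setting name and owner string are placeholders.

```csharp
using System;
using Microsoft.SharePoint.Administration;

// Ensure a custom appSettings key exists in the Web Application's web.config.
SPWebApplication webApp = SPWebApplication.Lookup(new Uri("http://server"));

SPWebConfigModification mod = new SPWebConfigModification();
mod.Path = "configuration/appSettings";
mod.Name = "add[@key='MyFeatureSetting']";  // XPath identifying the node
mod.Value = "<add key='MyFeatureSetting' value='true' />";
mod.Owner = "MyFeature";                    // lets you find and remove it later
mod.Sequence = 0;
mod.Type = SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode;

webApp.WebConfigModifications.Add(mod);
webApp.Update();
webApp.WebService.ApplyWebConfigModifications();  // push to all servers
```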

WSPBuilder

Check out my short blog post on WSPBuilder with instructions on how to use it in a NAnt build script.

SharePoint Solution Installer

I will post a more extensive article with instructions on how to use this product and some ideas on how it might be improved, but for now, you can find the SharePoint Solution installer on CodePlex.

My Presentation Setup

Here's a little more detail on the setup I used in my presentation: the computers I was using, which ones were VMs, how I switched between them and my slides, what presentation software I used, and how I tied it all together.

SharePoint Box

Since SharePoint is a bit of a beast, I decided to run it on an external machine. It was running on a Sony Vaio with Vista. WSS 3.0 was installed directly on Vista. I will post in a week or so explaining how that's possible, with steps on getting it done yourself.

Visual Studio Machine

I prefer to run Visual Studio on a different machine than the one running SharePoint. The primary reason is that I can quickly revert the VM that's running SharePoint during my development cycle without worrying about losing work.

For my presentation, I ran Visual Studio on a virtual machine in VMWare Fusion for Mac running Windows XP Pro.

Tying them Together

Since there was no guarantee of wireless offered at the Advanced Technology Center, I made sure I was all set by hosting a wireless network from my Mac and connecting to it from the Vista laptop. From there, I was able to remote in and get to SharePoint.

Presentation Software

I enjoy using Apple's Keynote software. It lets me make very attractive slides and is very easy to use. Plus, it exports to many formats that anyone can read without losing anything like transitions or animations.

Switching between Everything

To switch between my slides and my 2 demo machines, I used a built-in Mac feature called Spaces. It's a beautiful implementation of multiple-desktops.

WSPBuilder: Generate SharePoint Solution Files

There's a project on CodePlex called WSPBuilder. It's a console application that gets you from a given folder structure to a SharePoint Solution file. It would be nice if it were a NAnt / MSBuild task, but since it's an exe, calling it from your build script is trivial anyway.

Folder Structure

12
    TEMPLATE
        CONTROLTEMPLATES
        FEATURES
        LAYOUTS
        XML
    (anything you want to go in the 12 hive. just add any other folders)
GAC
    (any assemblies you would like deployed to the GAC)
80
    BIN
    (anything you want to go in the web application's virtual directory)
solutionid.txt

Sample Console Call from NAnt

<exec program="C:\Path\to\WSPBuilder.exe" commandline="-DLLReferencePath GAC -WSPName OutputSolutionFileName.wsp -TraceLevel Verbose" workingdir="C:\Path\to\Solution\Folder\Structure" />
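Wrapped in a NAnt target, that same call might look like the following (the target name and paths are placeholders to adapt to your build script):

```xml
<target name="wsp" description="Package the solution into a .wsp file">
  <exec program="C:\Path\to\WSPBuilder.exe"
        workingdir="C:\Path\to\Solution\Folder\Structure"
        commandline="-DLLReferencePath GAC -WSPName OutputSolutionFileName.wsp -TraceLevel Verbose" />
</target>
```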