Technology Status Update 2016

by Jon Davis 10. July 2016 09:09

Hello, peephole. (people.) Just a little update. I've been keeping this blog online for some time, but my most recent blog entries are always so negative that I keep having to see that negativity every time I check to make sure the blog's up, lol. I'm tired of it, so I thought I'd post something positive.

My current job is one I hope to keep for years and years to come, and if that doesn't work out I'll be looking for one just like it and try to keep it for years and years to come. I'm so done with contracting and consulting (except the occasional mentoring session on code mentor -dot- io). I'm still developing, of course, and as technology is changing, here's what's up as I see it. 

  1. Azure is relevant. 
    The world really has shifted to cloud, and the majority of companies, finally, are offloading their hosting to the cloud. AWS, Azure, take your pick; everyone who hates Microsoft will obviously choose AWS, but Azure is the obvious choice for Microsoft stack folks, and there is nothing meaningful AWS has that Azure doesn't at this point. The amount of stuff on Azure is sufficiently terrifying in quantity and supposed quality to give me a thrill. So I'm done with hating on Azure; after all their marketing and nagging and pushing, Microsoft has crossed a threshold of market saturation such that I am adequately impressed. I guess that means I have to be an Azure fan, too, now. Fine. Yay Azure, woo. -.-
  2. ASP.NET is officially rebooted. 
    So I hear this thing called ASP.NET Core 1.0, formerly known as ASP.NET 5, formerly known as ASP.NET vNext, has RTM'd, and I hear it's like super duper important. It snuck by me, I haven't mastered it, but I know it enough to know a few things:
    • It's a total redux by means of redo. It's like the Star Trek reboot, except it's smaller and there are fewer planets it can manage; but it's exactly like the Star Trek reboot in that it will probably implode yours.
    • If you've built your career on ASP.NET and you want to continue living on ASP.NET's laurels, now is not the time to master ASP.NET Core 1.0. Give it another year or two to mature. 
    • If you're stuck on or otherwise fascinated by non-Microsoft operating systems, namely Mac and Linux, but you want to use the Microsoft programming stack, you absolutely must learn and master ASP.NET Core 1.0 and EF7.
    • If all you liked from ASP.NET Core 1.0 was the dynamic configs and build-time transpiles, you don't need ASP.NET Core for that LOL LOL ROFLMAO LOL LOL LOL *cough*
  3. The Aurelia Javascript framework is nearly ready.
    Overall, Javascript framework trends have stopped. Companies are building upon AngularJS 1.x. Everyone who's behind is talking about React as if it were new and suddenly relevant (it isn't new anymore). Everyone still implementing Knockout is out of the loop and will die off soon enough. jQuery is still ubiquitous and yet ignored as a thing, but meanwhile it just turned v3.

    I don’t know what to think about things anymore. Angular 2.0 requires TypeScript, people hate TypeScript because they hate transpilers. People are still comparing TypeScript with CoffeeScript. People are dumb. If it wasn’t for people I might like Angular 2.0, and for that matter I’d be all over AureliaJS, which is much nicer but just doesn’t have Google as the titanic marketing arm. In the end, let’s just get stuff done, guys. Build stuff. Don’t worry about frameworks. Learn them all as you need them.
  4. Node.js is fading and yet slowly growing in relevance.
    Do you remember .. oh heck, unless you're graying, probably not, anyway .. do you remember back in the day when the dynamic Internet was first loosed on the public, when C/C++ and Perl were used to execute from cgi-bin, and if you wanted to add dynamic stuff to a web site you had to learn Perl and maybe find Perl pearls and plop them into your own cgi-bin? Yeah, no, I never really learned Perl either, but I did notice the trend. In the end, what did C/C++ and Perl mean to us up until the last decade? Answer: ubiquitous availability, but not web server functionality; just an ever-present availability for scripts, utilities, hacks, and whatever. That is where node.js is headed. For anything web related, node.js has become, and will continue to be, a gigantic mess of disorganized, everyone-is-equal, noisily integrated modules that sort of work but will never be as stable in built compositions as more carefully organized platforms. Frankly, I see node.js being more relevant as a workstation runtime than a server runtime. Right now I'm looking at maybe poking at it in a TFS build environment, but not so much for hosting things.
    I will always have a bitter taste in my mouth over node.js after trying to get integrated with Express and watching the whole thing just crumble, with no documentation or community help to resolve it. And this happened not just once on the job (never resolved before I walked away) but also during a code-mentor mentoring session (which we didn't figure out either), even after a good year or so of platform maturity following the first instance. I still like node.js but will no longer be trying to build a career on it.
  5. Pay close attention and learn up on Swagger aka OpenAPI. 
    Remember when -- oh wait, no, unless you're graying, .. nevermind .. anyway -- once upon a time something called SOAP came out, and with it came a self-documentation feature: a combination of WSDL and some really handy HTML-generated scaffolding built into web services that would let you manually test SOAP-based services by filling out a self-generated form. Well, now that JSON-based REST is the entirety of the playing field, we need the same self-documentation. That's where Swagger came in a couple of years ago, and everyone uses it now. Swagger needs some serious overhauling--someone needs to come up with a Swagger-compliant UI built on more modular and configurable components, for example--but as a drop-in self-documentation feature for REST services it fits the bill.
    • Swagger can be had on .NET using a lib called Swashbuckle. If you use OData, there is a lib called Swashbuckle.OData. We use it very, very heavily where I work. (I was the one who found it and brought it in.) "Make sure it shows up and works in Swagger" is a requirement for all REST/OData APIs we build now.
    • Swagger is now OpenAPI, but it's still Swagger; there are not yet any OpenAPI artifacts that I know of other than Swagger. Which is lame. Swagger is ugly. Featureful, but ugly and non-modular.
    • Microsoft is listed as a contributing member of the OpenAPI committee, but I don't know what that means, and I don't see any generic output from OpenAPI yet. I'm worried that Microsoft will build a black box (rather than white box) Swagger-compliant alternative for ASP.NET Core.
    • Other curious ones to pay attention to, but which I don't see as significantly supported by the .NET community yet (maybe I haven't looked hard enough), are:
  6. OData v4 has potential but is implementation-heavy and sorely needs a v5. 
    A lot of investments have been made in OData v4 as a web-based facade to Entity Framework data resources. It's the foundation of everything the team I'm with is working on, and I've learned to hate it. LOL. But I see its potential. I hope investments continue because it is sorely missing fundamental features like
    • MS OData needs better navigation property filtering and security checking, whether by optionally redirecting navigation properties to EDM-mapped controller routes (yes, taking a performance hit) or some other means
    • MS OData '/$count' breaks when [ODataRoute] is declared, boo.
    • OData spec sorely needs "DISTINCT" feature
    • $select needs to be smarter about returning anonymous models and not just eliminating fields; if all you want is one field in a nested navigation property in a nested navigation property (equivalent of LINQ's .Select(x=>new {ID=x.ID, DesiredField2=x.Child.Child2.DesiredField2}), in the OData result set you will have to dive into an array and then into an array to find the one desired field
    • MS OData output serialization is very slow and CPU-heavy
    • Custom actions and functions, and making them exposed to Swagger via Swashbuckle.OData, make me want to pull my hair out. It sometimes takes two hours of screaming and choking people to set up a route in OData that would take me two minutes in Web API, and in the end I end up with a weird namespaced function name in the route like /OData/Widgets/Acme.GetCompositeThingmajig(4); there's no getting away from even the default namespace, and your EDM definition must be an EXACT match to what is clearly and obviously spelled out in the C# controller implementation, or you die. I mean, if Swashbuckle / Swashbuckle.OData can mostly figure most of it out without making us dress up in a weird Halloween costume, surely Microsoft's EDM generator should have been able to.
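    To make the $select complaint above concrete, here is a small Javascript sketch of the result shape (the property names are invented for illustration; this is not output from a real service): even when $select asks for a single field, it stays buried under each navigation-property collection level.

```javascript
// Hypothetical OData-style response shape: $select asked only for
// DesiredField2, but it is still nested inside each navigation
// property collection rather than flattened out.
var result = {
    value: [
        { Children: [ { Children2: [ { DesiredField2: "the one field" } ] } ] }
    ]
};

// The caller must dive into an array, and then into an array, to get it:
var desired = result.value[0].Children[0].Children2[0].DesiredField2;
```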
  7. "Simple CRUD apps" vs "messaging-oriented DDD apps"
    has become the new Microsoft vs Linux or C# vs Java or SQL vs NoSQL. 

    The war is really ugly. Over the last two or three years, people have really been talking about how microservices and reaction-oriented software have turned the software industry upside down. Those who hop on the bandwagon neglect to learn when to choose simpler tool chains for simple jobs; meanwhile, those who refuse to jump on the bandwagon use some really harsh, cruel words to describe the trend ("idiots", "morons", etc.). We need to learn to love and embrace all of these forms of software, allow them to grow us up, and know when to choose which pattern for which job.
    • Simple CRUD apps can still accomplish most business needs, making them preferable most of the time
      • .. but they don't scale well
      • .. and yet they require relatively little development knowledge to build and grow
    • Non-transactional message-oriented solutions, and related patterns like CQRS-ES, scale out well but scale developers' and testers' comprehension very poorly; they have an exponential complexity footprint. But for the thrill seekers they can be, frankly, hella fun and interesting, so long as they are not built upon ancient ESB systems like SAP, and so long as people can integrate in software planning war rooms.
    • Disparate data sourcing as with DDD with partial data replication is a DBA's nightmare. DBAs will always hate it, their opinions will always be biased, and they will always be right in their minds that it is wrong and foolish to go that route. They will sometimes be completely correct.

  8. Integrated functional unit tests are more valuable than TDD-style purist unit tests. That's my new conclusion about developer testing in 2016. The purist TDD mindset still has a role in the software developer's life. But there is more value in automated integration tests, and when things like Entity Framework are heavily in play, apparently it's better to build upon LocalDB automation than Moq.
    At least, that’s what my current employer has forced me to believe. Sadly, the purist TDD mindset that I tried to adopt and bring to the table was not even slightly appreciated. I don’t know if I’m going to burn in hell for being persuaded out of a purist unit testing mindset or not. We shall see, we shall see.
  9. I'm hearing some weird and creepy rumors I don't think I like about SQL Server moving to Linux and eventually getting itself renamed. I don't like it; I think it's unnecessary. Microsoft should just create another product and let SQL Server be SQL Server for Windows forever. Careers are built on such things. Bad Microsoft! Windows 8, the .NET Framework version name fiascos, Lync vs. Skype for Business .. when will you ever learn to stop breaking marketing details to fix what is already successful?!
  10. Speaking of SQL Server, SQL Server 2016 is RTM'd, and full blown SSMS 2016 is free.
  11. On-premises TFS 2015 only acquired gated check-in build support in a recent update. Seriously, like, what the heck, Microsoft? It's also super buggy; you get a nasty error message in Visual Studio while monitoring its progress. This is laughable.
    • Clear message from Microsoft: "If you want a premium TFS experience, Azure / Visual Studio Online is where you have to go." Microsoft is no longer a shrink-wrapped product company; they sell shrink-wrapped software only for the legacy folks as an afterthought. They are a hosted-platform company now, all the way.
      • This means that Windows 10 machines, including Nokia devices, are moving toward being subscription boxes with dumb client deployments. Boo.
  12. Another rumor I've heard is that Microsoft is going to abandon the game industry.

    The Xbox platform was awesome because Microsoft was all in. But they're not all in anymore, and it shows, and so now as they look at their lackluster profits, what did they expect?
    • Microsoft: either stay all-in with Xbox and with Windows 10 (dudes, have you seen Steam's Big Picture mode? no excuse!) or say goodbye to the consumer market forever. Seriously. Because we who thrive on the Microsoft platform are also gamers. I would recommend knocking yourselves over to partner with Valve and co-own the whole entertainment world like the oligarchies you both are, since Valve did so well at keeping the Windows PC relevant to the gaming markets.

For the most part I've probably lived under a rock, I'm sure. I've been too busy enjoying my new 2016 Subaru WRX (a 4-door racecar), which I am probably going to sell in the next year because I didn't get out of debt first, but not before getting a Kawasaki Vulcan S ABS Café as my first motorized two-wheeler, riding that between playing Steam games, going camping, and exploring other ways to appreciate being alive on this planet. Maybe someday I'll learn to help the homeless and unfed, as I should. BTW, in the end I happen to know that "love God and love people" are the only two things that matter in life. The rest is fluff. But I'm so selfish; man, do I enjoy fluff. I feel like such a jerk. Those who know me know that I am one. God help me.

Top row: Fluff that doesn’t matter and distracts me from matters of substance.
Bottom row: Matters of substance.



Announcing Fast Koala, an alternative to Slow Cheetah

by Jon Davis 17. July 2015 19:47

So this is a quick FYI for teh blogrollz that I have recently been working on a little Visual Studio extension that will do for web apps what Slow Cheetah refused to do. It enables build-time transformations for both web apps and for Windows apps and classlibs. 

Here's the extension:  

Here's the Github:

Either link will explain more.

Introducing XIO (xio.js)

by Jon Davis 3. September 2013 02:36

I spent the latter portion of last week and the bulk of the holiday fleshing out the initial prototype of XIO ("ecks-eye-oh" or "zee-oh", I don't care at this point). It was intended to start out as an I/O library targeting everything (get it? X I/O, as in I/O for x), but that in turn forced me to make it a repository library with RESTful semantics. I still want to add stream-oriented functionality (WebSocket / long polling) to it to make it truly an I/O library. In the meantime, I hope people can find it useful as a consolidated interface library for storing and retrieving data.

You can access this project here:

Here's a snapshot of the README file as it was at the time of this blog entry.

XIO (xio.js)

version 0.1.1 initial prototype (all 36-or-so tests pass)

A consistent data repository strategy for local and remote resources.

What it does

xio.js is a Javascript resource that supports reading and writing data to/from local data stores and remote servers using a consistent interface convention. One can write code that can be more easily migrated between storage locations and/or URIs, and repository operations are simplified into a simple set of verbs.

To write and read to and from local storage,

xio.set.local("mykey", "myvalue");
var value = xio.get.local("mykey")();

To write and read to and from a session cookie,

xio.set.cookie("mykey", "myvalue");
var value = xio.get.cookie("mykey")();

To write and read to and from a web service (as optionally synchronous; see below),

xio.post.mywebservice("mykey", "myvalue");
var value = xio.get.mywebservice("mykey")();

See the pattern? It supports localStorage, sessionStorage, cookies, and RESTful AJAX calls, using the same interface and conventions.

It also supports generating XHR functions and providing implementations that look like:

mywebservice.post("mykey", "myvalue");
var value = mywebservice.get("mykey")(); // assumes synchronous; see below

Optionally synchronous (asynchronous by default)

Whether you're working with localStorage or an XHR resource, each operation returns a promise.

When the action is synchronous, such as when working with localStorage, it returns a "synchronous promise", which is essentially a function that can optionally be invoked immediately; doing so wraps .success(value) and returns the value. This also works with XHR when async: false is passed in with the options during setup (define(..)).

The examples below are equivalent, but only because XIO knows that the localStorage implementation of get is synchronous.

Asynchronous convention: var val; xio.get.local('mykey').success(function(v) { val = v; });

Synchronous convention: var val = xio.get.local('mykey')();
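For illustration, a "synchronous promise" along these lines could be sketched like this (a minimal, hypothetical sketch of the idea described above, not XIO's actual implementation):

```javascript
// A function that returns its value when invoked directly, but also
// exposes .success()/.complete() so the asynchronous convention works too.
function makeSynchronousPromise(value) {
    var promise = function () { return value; }; // direct invocation: var v = p();
    promise.success = function (callback) {
        callback(value); // the value is already known, so call back immediately
        return promise;  // allow chaining, e.g. .success(..).complete(..)
    };
    promise.complete = function (callback) {
        callback();
        return promise;
    };
    return promise;
}

// Synchronous convention:
var val = makeSynchronousPromise("myvalue")();

// Asynchronous convention against the same wrapper:
var val2;
makeSynchronousPromise("myvalue").success(function (v) { val2 = v; });
```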

Generated operation interfaces

Whenever a new repository is defined using XIO, a set of supported verbs and their implementing functions is returned and can be used as a repository object. For example:

var myRepository = xio.define('myRepository', { 
    url: '/myRepository?key={0}',
    methods: ["GET", "POST", "PUT", "DELETE"]
});

.. would populate the variable myRepository with:

{
    get: function(key) { /* .. */ },
    post: function(key, value) { /* .. */ },
    put: function(key, value) { /* .. */ },
    delete: function(key) { /* .. */ }
}

.. and each of these would return a promise.

XIO's alternative convention

But the built-in convention is a bit unique, using xio[action][repository](key, value), e.g."mykey", {first: "Bob", last: "Bison"}), which, again, returns a promise.

This syntactical convention, with the verb preceding the repository, is different from the usual convention of repository_object.method(key, value).


The primary reason was to be able to isolate the repository from the operation, so that one could theoretically swap out one repository for another with minimal or no changes to CRUD code. For example,

var repository = "local"; // use localStorage for now; 
                          // replace with "my_restful_service" when ready 
                          // to integrate with the server

xio.set[repository](key, value).complete(function() {
    xio.get[repository](key).success(function(val) {
        // do something with val
    });
});

Note here how "repository" is something that can move around. The goal, therefore, is to make disparate repositories such as localStorage and RESTful web service targets support the same features using the same interface.

As a bit of an experiment, this convention of xio[verb][repository] also seems to read and write a little better, even if it's a bit weird at first to see. The thinking is similar to the verb-target convention in PowerShell. Rather than taking a repository and working with it independently with assertions that it will have some CRUD operations available, the perspective is flipped and you are focusing on what you need to do, the verbs, first, while the target becomes more like a parameter or a known implementation of that operation. The goal is to dumb down CRUD operation concepts and repositories and refocus on the operations themselves so that, rather than repositories having an unknown set of operations with unknown interface styles and other features, instead, your standard CRUD operations, which are predictable, have a set of valid repository targets that support those operations.
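The verb-first flip described above can be roughly illustrated like this (a sketch of the idea only; the helper names are hypothetical and this is not xio.js source):

```javascript
// Repositories are registered as named targets under each verb, so the
// caller writes xioSketch[verb][repository](..) rather than repo.verb(..).
var xioSketch = { get: {}, set: {} };

function registerRepository(name, impl) {
    ["get", "set"].forEach(function (verb) {
        if (typeof impl[verb] === "function") {
            xioSketch[verb][name] = impl[verb]; // xio[verb][repository]
        }
    });
}

// An in-memory stand-in for localStorage:
var memory = {};
registerRepository("local", {
    get: function (key) { return memory[key]; },
    set: function (key, value) { memory[key] = value; }
});

xioSketch.set.local("mykey", "myvalue");
var fetched = xioSketch.get.local("mykey"); // "myvalue"
```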

This approach would have been entirely unnecessary and pointless if Javascript inherently supported interfaces, because then we could just define a CRUD interface and write all our repositories against those CRUD operations. But it doesn't, and indeed with the convention of closures and modules, it really can't.
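Lacking interfaces, the closest Javascript substitute is duck typing: checking at runtime that a candidate repository exposes the CRUD functions a caller needs. A hypothetical helper (not part of XIO) makes the point:

```javascript
// Returns true only if the repository exposes every requested verb
// as a function -- a runtime stand-in for an interface check.
function supportsCrud(repository, verbs) {
    return verbs.every(function (verb) {
        return typeof repository[verb] === "function";
    });
}

var repo = {
    get: function (key) { /* .. */ },
    post: function (key, value) { /* .. */ }
};

var hasGetPost = supportsCrud(repo, ["get", "post"]);         // true
var hasFullCrud = supportsCrud(repo, ["get", "post", "put"]); // false
```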

Meanwhile, when you define a repository with xio.define(), as was described above and detailed again below, it returns an object that contains the operations (get(), post(), etc) that it supports. So if you really want to use the conventional repository[method](key, value) approach, you still can!


Download here:

To use the whole package (by cloning this repository)

.. and to run the Jasmine tests, you will need Visual Studio 2012 and a registration of the .json file type with IIS / IIS Express MIME types. Open the xio.js.csproj file.


jQuery is required for now, for XHR-based operations, so it's not quite ready for node.js. This dependency requirement might be dropped in the future.

Basic verbs

See xio.verbs:

  • get(key)
  • set(key, value); used only by localStorage, sessionStorage, and cookie
  • put(key, data); defaults to "set" behavior when using localStorage, sessionStorage, or cookie
  • post(key, data); defaults to "set" behavior when using localStorage, sessionStorage, or cookie
  • delete(key)
  • patch(key, patchdata); implemented based on JSON/Javascript literals field sets (send only deltas)
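The delta semantics of patch can be sketched against a localStorage-like store (an in-memory object stands in for localStorage here; this is an illustration of the behavior described above, not xio.js source):

```javascript
// An in-memory stand-in for localStorage:
var store = {};

function set(key, value) { store[key] = JSON.stringify(value); }
function get(key) { return JSON.parse(store[key]); }
function patch(key, patchdata) {
    var current = get(key);
    for (var field in patchdata) {
        if (patchdata.hasOwnProperty(field)) {
            current[field] = patchdata[field]; // apply only the delta
        }
    }
    set(key, current);
}

set("my_key", { first: "Bob", last: "Jones" });
patch("my_key", { last: "Jonas" }); // only the last name is sent/changed
var result = get("my_key");         // { first: "Bob", last: "Jonas" }
```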
// initialize

var xio = Xio(); // initialize a module instance named "xio"

localStorage

xio.set.local("my_key", "my_value");
var val = xio.get.local("my_key")();

// or, get using asynchronous conventions, ..    
var val;
xio.get.local("my_key").success(function(v) {
    val = v;
});

xio.set.local("my_key", {
    first: "Bob",
    last: "Jones"
}).complete(function() {
    xio.patch.local("my_key", {
        last: "Jonas" // keep first name
    });
});

sessionStorage

xio.set.session("my_key", "my_value");
var val = xio.get.session("my_key")();

cookie

xio.set.cookie(..)

.. supports these arguments: (key, value, expires, path, domain)

Alternatively, retaining only the xio.set["cookie"](key, value) arguments, you can use the automatically returned helper replacer functions:

xio.set["cookie"](skey, svalue)
    .expires(new Date( + 30 * 24 * 60 * 60000))

Note that using this approach, while more expressive and potentially more convertible to other CRUD targets, also results in each helper function deleting the previously set value and setting it again with the new adjustment applied.
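That replace-on-adjust behavior can be sketched with an in-memory jar standing in for document.cookie (the function names here are hypothetical, not XIO's internals):

```javascript
var jar = {};
var log = [];

function setCookie(key, value, attrs) {
    if (jar.hasOwnProperty(key)) {
        delete jar[key];            // the previous value is deleted first
        log.push("deleted " + key);
    }
    jar[key] = { value: value, attrs: attrs || {} };
    log.push("wrote " + key);
    return {
        expires: function (date) {
            // re-set the same value with the new expiration applied
            return setCookie(key, value, { expires: date });
        }
    };
}

setCookie("skey", "svalue").expires(new Date( + 30 * 24 * 60 * 60000));
// log is now: ["wrote skey", "deleted skey", "wrote skey"]
```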

session cookie
xio.set.cookie("my_key", "my_value");
var val = xio.get.cookie("my_key")();
persistent cookie
xio.set.cookie("my_key", "my_value", new Date( + 30 * 24 * 60 * 60000));
var val = xio.get.cookie("my_key")();
web server resource (basics)
var define_result =
    xio.define("basic_sample", {
                url: "my/url/{0}/{1}",
                methods: [ xio.verbs.get,, xio.verbs.put, xio.verbs.delete ],
                dataType: 'json',
                async: false
    });
var promise = xio.get.basic_sample([4,12]).success(function(result) {
   // ..
});
// alternatively ..
var promise_ = define_result.get([4,12]).success(function(result) {
   // ..
});

The define() function creates a verb handler or route.

The url property is an expression that is formatted with the key parameter of any XHR-based CRUD operation. The key parameter can be a string (or number) or an array of strings (or numbers, which are convertible to strings). This value will be applied to the url property using the same convention as the typical string formatters in other languages such as C#'s string.Format().
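The {0}/{1} formatting convention can be sketched like this (an illustrative implementation of the convention described above; XIO's real formatter may differ in details such as encoding):

```javascript
// Replaces {0}, {1}, .. in the template with the corresponding key
// element, accepting either a single scalar key or an array of keys.
function formatUrl(template, key) {
    var keys = Array.isArray(key) ? key : [key];
    return template.replace(/\{(\d+)\}/g, function (match, index) {
        return encodeURIComponent(String(keys[index]));
    });
}

var a = formatUrl("my/url/{0}/{1}", [4, 12]); // "my/url/4/12"
var b = formatUrl("svcapi/contact/{0}", 7);   // "svcapi/contact/7"
```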

Where the methods property is defined as an array of "GET", "POST", etc., each one mapping to a standard XIO verb, an XHR route will be internally created on behalf of the rest of the options defined in the options object that is passed in as a parameter to define(). The return value of define() is an object that lists all of the various operations that were wrapped for XIO (i.e. get(), post(), etc.).

The rest of the options are used, for now, as jQuery's $.ajax(..., options) parameter. The async property defaults to true. When async is false, the returned promise is wrapped with a "synchronous promise", which you can optionally immediately invoke with parens (()), and that will return the value that is normally passed into .success(function (value) { .. }).

In the above example, define_result is an object that looks like this:

{
    get: function(key) { /* .. */ },
    post: function(key, value) { /* .. */ },
    put: function(key, value) { /* .. */ },
    delete: function(key) { /* .. */ }
}

In fact,

define_result.get === xio.get.basic_sample

.. should evaluate to true.

Sample 2:

var ops = xio.define("basic_sample2", {
                get: function(key) { return "value"; },
                post: function(key,value) { return "ok"; }
    });
var promise = xio.get["basic_sample2"]("mykey").success(function(result) {
   // ..
});

In this example, the get() and post() operations are explicitly declared into the defined verb handler and wrapped with a promise, rather than internally wrapped into XHR/AJAX calls. If an explicit definition returns a promise (i.e. an object with .success and .complete), the returned promise will not be wrapped. You can mix-and-match both generated XHR calls (with the url and methods properties) as well as custom implementations (with explicit get/post/etc properties) in the options argument. Custom implementations will override any generated implementations if they conflict.

web server resource (asynchronous GET)
xio.define("specresource", {
                url: "spec/res/{0}",
                methods: [xio.verbs.get],
                dataType: 'json'
    });
var val;
xio.get.specresource("myResourceAction").success(function(v) { // gets http://host_server/spec/res/myResourceAction
    val = v;
}).complete(function() {
    // continue processing with populated val
});
web server resource (synchronous GET)
xio.define("synchronous_specresources", {
                url: "spec/res/{0}",
                methods: [xio.verbs.get],
                dataType: 'json',
                async: false // <<==!!!!!
    });
var val = xio.get.synchronous_specresources("myResourceAction")(); // gets http://host_server/spec/res/myResourceAction
web server resource POST
xio.define("contactsvc", {
                url: "svcapi/contact/{0}",
                methods: [ xio.verbs.get, ],
                dataType: 'json'
    });
var myModel = {
    first: "Fred",
    last: "Flinstone"
};
var val =, myModel).success(function(id) { // posts to http://host_server/svcapi/contact/
    // model has been posted, new ID returned
    // validate:
    xio.get.contactsvc(id).success(function(contact) {  // gets from http://host_server/svcapi/contact/{id}
        // ..
    });
});
web server resource (DELETE)
web server resource (PUT)
xio.define("contactsvc", {
                url: "svcapi/contact/{0}",
                methods: [ xio.verbs.get,, xio.verbs.put ],
                dataType: 'json'
    });
var myModel = {
    first: "Fred",
    last: "Flinstone"
};
var val =, myModel).success(function(id) { // posts to http://host_server/svcapi/contact/
    // model has been posted, new ID returned
    // now modify:
    myModel = {
        first: "Carl",
        last: "Zeuss"
    };
    xio.put.contactsvc(id, myModel).success(function() {  /* .. */ }).error(function() { /* .. */ });
});
web server resource (PATCH)
xio.define("contactsvc", {
                url: "svcapi/contact/{0}",
                methods: [ xio.verbs.get,, xio.verbs.patch ],
                dataType: 'json'
    });
var myModel = {
    first: "Fred",
    last: "Flinstone"
};
var val =, myModel).success(function(id) { // posts to http://host_server/svcapi/contact/
    // model has been posted, new ID returned
    // now modify:
    var myModification = {
        first: "Phil" // leave the last name intact
    };
    xio.patch.contactsvc(id, myModification).success(function() {  /* .. */ }).error(function() { /* .. */ });
});
custom implementation and redefinition
xio.define("custom1", {
    get: function(key) { return "teh value for " + key; }
});
xio.get.custom1("tehkey").success(function(v) { alert(v); } ); // alerts "teh value for tehkey"
xio.redefine("custom1", xio.verbs.get, function(key) { return "teh better value for " + key; });
xio.get.custom1("tehkey").success(function(v) { alert(v); } ); // alerts "teh better value for tehkey"
var custom1 = 
    xio.redefine("custom1", {
        url: "customurl/{0}",
        methods: [],
        get: function(key) { return "custom getter still"; }
    });"tehkey", "val"); // asynchronously posts to URL http://host_server/customurl/tehkey
xio.get.custom1("tehkey").success(function(v) { alert(v); } ); // alerts "custom getter still"

// oh by the way,
for (var p in custom1) {
    if (custom1.hasOwnProperty(p) && typeof(custom1[p]) == "function") {
        console.log("custom1." + p); // should emit custom1.get and
    }
}

Future intentions

WebSockets and WebRTC support

The original motivation to produce an I/O library was actually to implement a WebSockets client that can fallback to long polling, and that has no dependency upon jQuery. Instead, what has so far become implemented has been a standard AJAX interface that depends upon jQuery. Go figure.

If and when WebSocket support gets added, the next step will be WebRTC.

Meanwhile, jQuery needs to be replaced with something that works fine on nodejs.

Additionally, in a completely isolated parallel path, if no progress is made by the ASP.NET SignalR team to free the SignalR client from jQuery, xio.js might become tailored to be a somewhat code-compatible client implementation or a support library for a separate SignalR client implementation.

Service Bus, Queuing, and background tasks support

At an extremely lightweight scale, I do want to implement some service bus and queue features. For remote service integration, this would just be more verbs to sit on top of the existing CRUD operations, as well as WebSockets / long polling / SignalR integration. This is all fairly vague right now because I am not sure yet what it will look like. On a local level, however, I am considering integrating with Web Workers. It might be nice to use XIO to manage deferred I/O via the Web Workers feature. There are major limitations to Web Workers, however, such as no access to the DOM, so I am not sure yet.

Other notes

If you run the Jasmine tests, make sure the .json file type is set up as a mime type. For example, IIS and IIS Express will return a 403 otherwise. Google reveals this:


The license for XIO is pending, as it's not as important to me as getting some initial feedback. It will definitely be an attribution-based license. If you use xio.js as-is, unchanged, with the comments at top, you definitely may use it for any project. I will drop in a license (probably Apache 2 or BSD or Creative Commons Attribution or somesuch) in the near future.

Gemli.Data: Basic LINQ Support

by Jon Davis 6. June 2010 04:13

In the Development branch of Gemli.Data, I finally got around to adding some very, very basic LINQ support. The following test scenarios currently seem to function correctly:

var myCustomEntityQuery = new DataModelQuery<DataModel<MockObject>>();
// Scenarios 1-3: Where() lambda, boolean operator, method exec
var linqq = myCustomEntityQuery.Where(mo => mo.Entity.MockStringValue == "dah");
linqq = myCustomEntityQuery.Where(mo => mo.Entity.MockStringValue != "dah");
// In the following scenario, GetPropertyValueByColumnName() is an explicitly supported method
linqq = myCustomEntityQuery.Where(mo => ((int)mo.GetPropertyValueByColumnName("customentity_id")) > -1);
// Scenario 4: LINQ formatted query
var q = (from mo in myCustomEntityQuery
         where mo.Entity.MockStringValue != "st00pid"
         select mo) as DataModelQuery<DataModel<MockObject>>;
// Scenario 5: LINQ formatted query execution with sorted ordering
var orderedlist = (from mo in myCustomEntityQuery
                   where mo.Entity.MockStringValue != "def"
                   orderby mo.Entity.MockStringValue
                   select mo).ToList();
// Scenario 6: LINQ formatted query with multiple conditions and multiple sort members
var orderedlist2 = (from mo in myCustomEntityQuery
                    where mo.Entity.MockStringValue != "def" && mo.Entity.ID < 3
                    orderby mo.Entity.ID, mo.Entity.MockStringValue
                    select mo).ToList();

This is a great milestone, one I’m very pleased with myself for finally accomplishing. There’s still a ton more to do but these were the top 50% or so of LINQ support scenarios needed in Gemli.Data.

Unfortunately, adding LINQ support brought about a rather painful rediscovery of critical missing functionality: the absence of support for OR (||) and condition groups in Gemli.Data queries. *facepalm*  I left it out earlier as a to-do item but completely forgot to come back to it. That’s next on my plate. *burp*

Gemli.Data v0.3.0 Released

by Jon Davis 12. October 2009 03:57

I reached a milestone this late night with The Gemli Project and decided to go ahead and release it. This release follows an earlier blog post describing some syntactical sugar that was added along with pagination and count support, here:

Remember those tests I kept noting that I still need to add to test the deep loading and deep saving functionality? No? Well, I remember them. I kept procrastinating on them. And as it turns out, a lesson has been taught to me several times in the past, and again this late night while trying to implement a few basic tests, a lesson I never realized was being taught until now. The lesson goes something like this:

If in your cocky self-confidence you procrastinate the tests of a complex system because it is complex and because you are confident, you are guaranteed to watch them fail when you finally add them.

Mind you, the “system” in context is not terribly complex, but I’ll confess there was tonight a rather ugly snippet of code to wade through, particularly while Jay Leno or some other show was playing in the background to keep me distracted. On that latter note, it wasn’t until I switched to ambient music that I was finally able to fix a bug that I’d spent hours staring at.

This milestone release represents 130 total unit tests over the lifetime of the project (about 10-20 of them new, I'm not sure exactly how many), not nearly enough, but enough to feel a lot better about what Gemli is shaping up to be. I still need to add a lot more tests, but what these tests exposed was a slew of buggy relationship inferences that needed to be taken care of. I felt the urgency to release now rather than continue adding tests first because the previous release was just not suitably functional on the relationships side.


Gemli.Data Runs On Mono (So Far)

by Jon Davis 20. September 2009 00:44

I was sitting at my Mac Mini to remote into my Windows 7 machine and I had this MonoDevelop 2.0 icon sitting there (I had installed it a couple days ago just to see it install and run) and I thought I should give Gemli.Data a quick, simple test run.

Screen shot 2009-09-20 at 12.38.02 AM

Notice the “Application output” window. It werkie!!

I was concerned that Mono hadn’t been fleshed out well enough yet to be stable with Gemli.Data, since I do a lot of reflection-heavy stuff with it to get it to infer things, not to mention the System.Data dependency. But it seems to work fine.

This isn’t by any means a real test, more of a quick and dirty mini smoke test. I was very happy to see that it works fine, and my confidence in Mono just went up a notch.

Not shown in the screen shot, I expanded it a little tiny bit to have multiple records, then I added sorting in the query. Apparently Gemli doesn't have sort support in the MemoryDataProvider yet so I used LINQ-to-Objects. Yes, LINQ syntax worked in Mono, too! Nifty!
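The workaround looked something like this LINQ-to-Objects pass over the loaded records (the `MockRecord` type and data here are illustrative stand-ins, not Gemli's actual API):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class MockRecord
{
    public int ID { get; set; }
    public string Name { get; set; }
}

class Program
{
    static void Main()
    {
        // pretend these came back unsorted from MemoryDataProvider
        var records = new List<MockRecord>
        {
            new MockRecord { ID = 2, Name = "beta" },
            new MockRecord { ID = 3, Name = "gamma" },
            new MockRecord { ID = 1, Name = "alpha" },
        };

        // LINQ-to-Objects supplies the ordering the provider lacks
        var sorted = records.OrderBy(r => r.Name).ToList();

        foreach (var r in sorted)
            Console.WriteLine("{0}: {1}", r.ID, r.Name);
    }
}
```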

Introducing The Gemli Project

by Jon Davis 23. August 2009 21:11

I’ve just open-sourced my latest pet project. This project was started at around the turn of the year, and its scope has been evolving over the last several months. I started out planning on a framework for rapid development and management of a certain type of web app, but the first hurdle as with anything was picking a favorite DAL methodology. I tend to lean towards O/RMs as I hate manually managed DAL code, but I didn’t want to license anything (it’d mean sublicensing if I redistribute what I produce), I have some issues with LINQ-to-SQL and LINQ-to-Entities, I find nHibernate difficult to swallow, I have some reservations about SubSonic, and I just don’t have time nor interest to keep perusing the many others. Overall, my biggest complaint with all the O/RM offerings, including Microsoft’s, is that they’re too serious. I wanted something lightweight that I don’t have to think about much when building apps.

So as a project in itself I decided first to roll my own O/RM, in my own ideal flavor. Introducing Gemli.Data! It’s a lightweight O/RM that infers as much as possible using reflection, assumes the most basic database storage scenarios, and helps me keep things as simple as possible.

That’s hype-speak to say that this is NOT a “serious O/RM”, it’s intended more for from-scratch prototype projects for those of us who begin with C# classes and want to persist them, and don’t want to fuss with the database too much.

I got the code functioning well enough (currently 92 unit tests, all passing) that I felt it was worth it to go ahead and let other people start playing with it. Here it is!

Gemli Project Home: 

Gemli Project Code:

Gemli.Data is currently primarily a reflection-based mapping solution. Here’s a tidbit sample of functioning Gemli.Data code (this comes from the CodePlex home page for the project):

// attributes only used where the schema is not inferred
// inferred: [DataModelTableMapping(Schema = "dbo", Table = "SamplePoco")]
public class SamplePoco
{
    // inferred: [DataModelFieldMapping(ColumnName = "ID", IsPrimaryKey = true, IsIdentity = true, 
    //     IsNullable = false, DataType = DbType.Int32)] // note: DbType.Int32 is SQL type: int
    public int ID { get; set; }

    // inferred: [DataModelFieldMapping(ColumnName = "SampleStringValue", IsNullable = true, 
    //     DataType = DbType.String)] // note: DbType.String is SQL type: nvarchar
    public string SampleStringValue { get; set; }

    // inferred: [DataModelFieldMapping(ColumnName = "SampleDecimalValue", IsNullable = true, 
    //     DataType = DbType.Decimal)] // note: DbType.Decimal is SQL type: money
    public decimal? SampleDecimalValue { get; set; }
}

public void CreateAndDeleteEntityTest()
{
    var sqlFactory = System.Data.SqlClient.SqlClientFactory.Instance;
    var dbProvider = new DbDataProvider(sqlFactory, TestSqlConnectionString);

    // create my poco
    var poco = new SamplePoco { SampleStringValue = "abc" };

    // wrap and auto-inspect my poco
    var dew = new DataModel<SamplePoco>(poco); // data entity wrapper

    // save my poco
    dew.DataProvider = dbProvider;
    dew.Save(); // auto-synchronizes ID
    // or...
    //dew.SynchronizeFields(SyncTo.ClrMembers); // manually sync ID

    // now let's load it and validate that it was saved
    var mySampleQuery = DataModel<SamplePoco>.NewQuery()
        .WhereProperty["ID"].IsEqualTo(poco.ID); // poco.ID was inferred as IsIdentity so we auto-returned it on Save()
    var data = dbProvider.LoadModel(mySampleQuery);
    Assert.IsNotNull(data); // success!

    // by the way, you can go back to the POCO type, too
    SamplePoco poco2 = data.Entity; // no typecast nor "as" statement
    Assert.IsTrue(poco2.ID > 0);
    Assert.IsTrue(poco2.SampleStringValue == "abc");

    // test passed, let's delete the test record
    data.MarkDeleted = true;
    data.Save(); // persists the deletion

    // ... and make sure that it has been deleted
    data = dbProvider.LoadModel(mySampleQuery);
    Assert.IsNull(data);
}


Gemli.Data supports strongly typed collections and multiple records, too, of course.

var mySampleQuery = DataModel<SamplePoco>.NewQuery();
var models = dbProvider.LoadModels(mySampleQuery);
SamplePoco theFirstSamplePocoEntity = models.Unwrap<SamplePoco>()[0];
// or.. SamplePoco theFirstSamplePocoEntity = models[0].Entity;

Anyway, go to the URLs above to look at more of this. It will be continually evolving.


C# | Open Source | Software Development | Pet Projects

Quickie Alarm

by Jon Davis 13. May 2009 23:39

I flew out to California on Friday night to visit my siblings, and when I arrived I rented a car and drove around in the middle of the night looking for a cheap place to sleep. After an hour of getting completely lost I finally found a Motel 6 (*shudder*), climbed into bed, and then realized that there’s no alarm clock in the room. Great.

So I threw together another alarm clock on my laptop. I went through the trouble of giving it a nice touch, so it took an hour or two rather than a minute or two, but it did the job and I was proud and still managed to sleep well.

And now I’m sharing it here online. This is freeware with source code included. (Uses free Developer Express controls.)





jQuery Has Won The 3+ Year Javascript Framework Battle (As Far As I'm Concerned)

by Jon Davis 28. September 2008 15:33

It's official. jQuery has become the new de facto standard for the web development community. By rolling jQuery in with Visual Studio and the ASP.NET core tools pipeline, a whole new precedent has been set in the software industry.

jQuery was already supported in many major IDEs, including Aptana Studio (which is built on Eclipse), but usually only sharing with other frameworks like prototype. But there are two IDEs that have pretty much ruled the software industry for the last several years: Visual Studio and Eclipse. Neither one has chosen any particular "favorite" Javascript open source framework. You usually get a bundle of different frameworks being supported or nothing at all (import something yourself or roll your own).

But Microsoft's decision to adopt a third party software framework, bundle it, and make it a foundational component of its own, is an earth-shaking paradigm shift. This is something that will turn the software industry on its head. There is a whole industry carved out from the trenches that Microsoft dug. Giving a third party framework the honor of being placed into the middle of it all and running half the show, so to speak, is absolutely breathtaking, a moment to be awed. Right now everyone should take a moment and let their mouths gape, because this is just short of bizarre.

And I mean that with no pretensions. I'm not saying that "this is unlike Microsoft", although it is, because there really is no precedent for this. The only precedents I can think of have been support for open standards--support for HTML (Internet Explorer), HTTP, FTP (bundled in Explorer), the TCP/IP stack, OpenGL, keyboard/mouse standardization, compact disc file system support, and standard driver support. But all of those things have traditionally had, with very few exceptions, a proprietary implementation, either of Microsoft's own making or bought out. Most of the exceptions come from third parties such as Intel, who licensed technology, which is not the same as bundling open source code.

jQuery is licensed on the MIT license. Microsoft will be a "normal" participant with the jQuery community just like anyone else; they will introduce ideas, report bugs, and propose bug fixes, but they will go through a QA and approval process just like everyone else.

The closest thing I can think of that even remotely equates to Microsoft getting this involved with and supporting of outsiders in the web community was back in the late 90s, when Microsoft got very involved with the W3C and helped shape the directions of Dynamic HTML and the DOM, not to mention their extensive involvement with the XML and then SOAP initiatives and the insanely detailed UDDI [dis]proving that followed. But once again, those are standards / protocols, not code. So even though Microsoft has done amazing shifts in supporting the open source communities with CodePlex (bravo!), I'm curious if this really is the first time, ever, that Microsoft has done this on behalf of their proprietary development platforms (Visual Studio, ASP.NET).

On a final note, I must say that I absolutely adore jQuery and what it does for web development. jQuery working with Microsoft's ASP.NET MVC and C# 3.0 w/ LINQ are all a match made in heaven. Knowing that Microsoft is going to build on top of jQuery is almost like getting a wonderful new programming language akin to C#, but built for the web. So really, my day just went from being depressed from the last week to being literally overjoyed like I just got engaged to marry someone or something.

Mac OS X 10.5 (Leopard) + VMWare Fusion + Mono = Bliss

by Jon Davis 17. May 2008 15:13

I have been using my new Mac Mini for less than 24 hours and it already looks like this:

In the screenshot I have VMWare Fusion with Unity enabled so that I have the Windows Vista Start menu (I can toggle off the Start menu's visibility from VMWare itself) and Internet Explorer 7. (I also have Visual Studio 2008 installed in that virtual machine). Next to Internet Explorer on the left is Finder which is showing a bunch of the apps I have installed, including most of the stuff at On the right I have MonoDevelop where I can write C# or VB.NET applications for the Mac, for Linux, or for Windows. And of course, down below I have the Dock popped up because that's where my arrow actually is.

I also, obviously, have an Ubuntu VM I can fire up any time I want if I want to test something in Linux. 

Mac OS X 10.5 (Leopard) comes with native X11, not out of the box but with the installer CD, and it's the first OS X build to do so (previous versions used or required XFree86).

This point in time is a particularly intriguing milestone date for the alignment of the moons and stars for blissful cross-platform development using the Mac as a central hub of all things wonderful:


  • X11 on Mac OS X 10.5
  • MonoDevelop 1.0 is generally gold (released, it's very nice)
  • System.Windows.Forms in Mono is API-complete
  • VMWare Fusion's Unity feature delivers jaw-dropping, seamless windowing integration between Windows XP / Vista and Mac OS X. And to make things even more wonderful, VMWare Fusion 2, which comes with experimental DirectX 9 support, will be a free upgrade.
  • For game developers, the Unity game engine is a really nice cross-platform game engine and development toolset. I have a couple buddies I'll be joining up with to help them make cross-platform games, something I always wanted to do. This as opposed to XNA, which doesn't seem to know entirely what it's doing and comes with a community framework that's chock full of vaporware. (But then, I still greatly admire XNA and hope to tackle XNA projects soon.)
  • The hackable iPhone (which I also got this week, hacked, and SSH'd into with ridiculous ease), which when supplemented with the BSD core, is an amazing piece of geek gadgetry that can enable anyone to write mobile applications using open-source tools (I'd like to see Mono running on it). The amount of quality software written for the hacked iPhone is staggering, about as impressive as the amount of open source software written for the Mac itself. Judging by the quantity of cool installable software, I had no idea how commonplace hacked iPhones were.
  • Meanwhile, for legit game development, the Unity 3D game engine now supports the iPhone and iPod Touch (so that's where XNA got the Zune support idea!) and the iPhone SDK is no longer just a bunch of CSS hacks for Safari but actually binary compile tools.



Powered by BlogEngine.NET
Theme by Mads Kristensen

About the author

Jon Davis (aka "stimpy77") has been a programmer, developer, and consultant for web and Windows software solutions professionally since 1997, with experience ranging from OS and hardware support to DHTML programming to IIS/ASP web apps to Java network programming to Visual Basic applications to C# desktop apps.
Software in all forms is also his sole hobby, whether playing PC games or tinkering with programming them. "I was playing Defender on the Commodore 64," he reminisces, "when I decided at the age of 12 or so that I want to be a computer programmer when I grow up."

Jon was previously employed as a senior .NET developer at a very well-known Internet services company whom you're more likely than not to have directly done business with. However, this blog and all of its content have no affiliation with, and are not representative of, his former employer in any way.

Contact Me 
