Have you ever played poker? Do you like a full house? A royal flush? Those are amazing, but the hand you're dealt is two 3's, a 5, a jack, and a 9. Speculation is often laced into software development and its future, but sometimes we need to deal with the hand we're dealt. As primarily a web developer these days, and on and off over the last 15+ years, I've always heard about that "next thing in the pipeline that will save us from the horrors of web development." It sounds kind of like a gambler who says, "if I can just play one more hand, I'll win it all back and more!"
This all builds up to be a bit cynical, but the reality is web development is, and always has been, an ever-changing, volatile platform to develop for. There is probably no better group of people accustomed to change than your savvy modern-day web developer. I don't think anyone should label web developers as 'in love' with how modern web development is implemented in general. It's a kludge of open source libraries and techniques that defines the very word 'web': a tangled web of libraries and techniques all pointing back to that love-hate relationship with JavaScript.
JavaScript in some respects is that mediocre poker hand we were dealt. However, in the web world its perceived weaknesses are at the same time its strengths. Albeit a dynamic language that frustrates many, it's also flexible and powerful, providing us with that "write once, run anywhere" ability because browsers are all able to interpret and run JavaScript. The language has, IMO, matured exponentially in the last 5-10 years, and with the addition of high-level superset languages like TypeScript, working with JavaScript isn't the nightmare it once was.
However, you won't see any "I love JavaScript" bumper stickers on the back of my car. I spent years working in various languages like WinBatch, VB, VBA, VB.NET, and C# on the server or client. The reality is that today those do us no good in web development on the client. Ah, but you say, WebAssembly is here to save the day! Yeah, I've heard this tune played before (cough, cough, Silverlight). C# running on the client, the same business rules on the client and server, our woes are over, etc., etc. Listen, I know Silverlight and WebAssembly are not the same at all. The only analogy I'm making is that it's not the first time in history something has been deemed the "savior" of web development. Call WebAssembly 'x' for this purpose, and "project x will save the day!" In reality I'm actually quite optimistic about where WebAssembly is going to take us on the client. Imagine using C# for everything? That would be beautiful. For an optimistic view of WebAssembly and its future with smart client development, check out Rocky Lhotka's post: A Bright Future for the Smart Client
Let's pump the brakes for a minute though. It's early 2018 and I'm expected to deliver today using the hand I'm dealt. In that case we are looking at a host of JS libraries and frameworks to help us build a responsive SPA. I also have the option of using server-side-heavy tech, but for the purpose of this post I'll focus on the more mainstream approach these days: using JS and building responsive JS apps. The bright side of all this is there isn't that "one way" to do development. I suppose that's a double-edged sword, but the reality is it gives us options. Like ES6? Use it. Don't like JS? Use TypeScript. Don't like Angular? Use React or Vue. Don't like LESS? Use SASS. The options go on and on and on. I do agree that in some ways this stinks. I often talk about the days early in my career when being really awesome at VB6 meant you could conquer the world, finish your job, and not have to worry about 1,000 surrounding technologies, languages, and libraries. I feel like times were a bit more straightforward. However, back in 2002 it was accepted that everybody and their brother was using Windows + IE and that was it. Times have changed and so has the web. The expectation today is run anywhere. Thankfully, JavaScript has provided the flexible nature to allow us to be fluid and make our way to being platform and browser agnostic. Thus we can reach a lot of people with a single code base.
So what does this all mean? Well, I can say one thing for today: the web is a volatile space and likely to continue being so in the near future. Even if WebAssembly or any other technology comes along and truly revolutionizes web development as promised, it will take a long time to wash out the gabillion lines of JS out there running the web today. That, and the fact we'll probably be in a scenario where "1/5th of the APIs are available with new tech 'x', and a roadmap is available for the remainder." Point being, set yourself up well to play the hand you're dealt today no matter how it plays out. Be smart and architect your application properly to be successful today and in the future, no matter what technologies throw their hat in the ring to save web development. This includes the next best JS framework ever, head and shoulders above them all!
Put stock in your code on the server
How do we play our hand to isolate ripple effects as much as possible as technology proves to be volatile? The answer might be in looking at statistics from the past, rather than looking into a crystal ball. Let's take a walk through history for a moment in web development, starting 18 years ago in 2000. If you were to build, say, a banking or financial web application beginning at that time, and continued to keep current on a mix of .NET and client technologies, here is the journey you might have taken:
Classic ASP -> ASP.NET Web Forms -> ASP.NET Web Forms + AJAX Control Toolkit -> ASP.NET Web Forms + Silverlight -> ASP.NET MVC (ASPX engine) -> ASP.NET MVC (Razor engine) -> AngularJS -> Angular 2, 4, 5...
What does this all mean? The web layer is, and always has been, volatile. Maybe the buck stops with WebAssembly in the future, but that has yet to be proven; that's another crystal ball moment. I want to use the history of the journey we've taken to help make sound decisions today. This leads me to the point that I'd put little stock in the front-end of our application, and put stock in the server-side code.
Here's an example. Back in 2002, you wrote that financial or banking application using C#. If at the time you had used sound OO techniques dating back to the '70s, unit tested the code, and wrapped it in abstractions to deliver the data, that code could be mostly intact today. I couldn't say that for probably a single line of code on the front-end.
I'm not oblivious to the changes in both C# and .NET technologies, so odds are the code was refactored along the way. However, the most volatile parts on the server are the manner of accessing the data and the manner of delivering the data. Sure, there is volatility there as well. Maybe those core classes and business logic were once delivered via .NET Remoting, then an ASMX Web Service, followed by WCF, then Web API. However, those should be ultra-thin layers acting as a means to an end to deliver the data. The same applies to accessing the data. You started with ADO.NET, then moved to LINQ to SQL, then to Entity Framework and through all of its versions. Again, this didn't necessarily need to change all of your core code; this is another layer that should be understood as being volatile.
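To make that concrete, here's a minimal C# sketch of keeping those volatile edges behind abstractions. The type and member names (IAccountRepository, TransferService, etc.) are hypothetical and not from any particular project; the point is simply that the core business logic depends on an interface, so the data-access and delivery tech can be swapped without touching it.

```csharp
using System;

// Hypothetical abstraction over data access: the core code only sees this
// interface, never ADO.NET, LINQ to SQL, or Entity Framework directly.
public interface IAccountRepository
{
    Account GetById(int accountId);
    void Save(Account account);
}

public class Account
{
    public int Id { get; set; }
    public decimal Balance { get; set; }
}

// Core business logic: stable, unit-testable, and unaware of whichever
// data-access or delivery technology happens to be in fashion this year.
public class TransferService
{
    private readonly IAccountRepository _accounts;

    public TransferService(IAccountRepository accounts)
    {
        _accounts = accounts;
    }

    public void Transfer(int fromId, int toId, decimal amount)
    {
        var from = _accounts.GetById(fromId);
        var to = _accounts.GetById(toId);

        if (from.Balance < amount)
            throw new InvalidOperationException("Insufficient funds.");

        from.Balance -= amount;
        to.Balance += amount;

        _accounts.Save(from);
        _accounts.Save(to);
    }
}
```

Swapping LINQ to SQL for Entity Framework then means writing a new IAccountRepository implementation, not rewriting TransferService or its unit tests.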
So back to the web, today. I want to view the front-end layer as a thin and volatile layer that, as history has shown, seems to have an average lifespan of about 1-3 years. I want the meat and potatoes of my code, the code that matters, on the server. I won't even get into the intellectual property or security considerations of why code should be on the server, as I think that should be obvious with today's tech stack. I want to leverage the front-end code for only front-end matters. I want a pristine viewmodel of sorts returned via a service, where the only logic happening in the presentation layer is view manipulation and presentation logic. I've been on enough projects where IP- and rules-heavy work was done on the client (why? because you can!), only to have year-long projects to refactor that code back to the server behind services.
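As an illustration of that "pristine viewmodel" idea, here's a hedged sketch using ASP.NET Web API. The names (AccountSummaryViewModel, AccountsController) are hypothetical; the takeaway is that the server hands back a view-ready shape, so the client is left with nothing but presentation work.

```csharp
using System.Web.Http;

// Hypothetical view-ready shape: rules and formatting are decided on the
// server, so the client only binds and displays.
public class AccountSummaryViewModel
{
    public string AccountNumberMasked { get; set; } // e.g. "****1234"
    public string BalanceDisplay { get; set; }      // already formatted, e.g. "$1,234.56"
    public bool CanTransfer { get; set; }           // business rule evaluated server-side
}

// Thin Web API facade: no business rules here, just delegation and mapping
// from the domain model (see the earlier sketch) to the view.
[RoutePrefix("api/accounts")]
public class AccountsController : ApiController
{
    private readonly IAccountRepository _accounts;

    public AccountsController(IAccountRepository accounts)
    {
        _accounts = accounts;
    }

    [HttpGet, Route("{id:int}/summary")]
    public AccountSummaryViewModel GetSummary(int id)
    {
        var account = _accounts.GetById(id);

        return new AccountSummaryViewModel
        {
            AccountNumberMasked = "****" + (account.Id % 10000).ToString("D4"),
            BalanceDisplay = account.Balance.ToString("C"),
            CanTransfer = account.Balance > 0m
        };
    }
}
```

The JS or TypeScript on the other side of that call then does exactly what's argued for here: render the viewmodel and nothing more.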
Using an oversimplified view of the layers, let's look at the following, using physical size to demonstrate the placement and emphasis of code. Instead of a heavy client-side implementation like the one shown below:
I believe the implementation should be more along the lines of this based on everything discussed:
Put stock in your code on the server. History shows that in the battle of volatility the server side wins out; the web is just too dynamic. Add to this security, IP considerations, and the unbelievably volatile and ever-changing world that is the 'web,' and it really is the smart move to make. The cynics will bring up leveraging CPUs on the client vs. the server, but the world has changed. Data is small and efficient. The server has oodles of horsepower and is a known entity; we control it. The client is a wildcard; we don't know what they have. Long gone are the days when we knew the client was a Windows machine with a known spec, controlled by an organization.
When analyzing data, as opposed to the fat SOAP payloads of the past, payloads now are concise and to the point, leveraging HTTP standards to deliver data as lean as possible. One must also yield to the fact that 4G LTE and blazing-fast WiFi and networking speeds are becoming the norm and will only continue to get better. It's pointless to argue that all apps must work flawlessly offline, because that isn't reality across the board. We're in a world where, if we aren't connected, things just don't work. Don't plan on making any airline reservations or bank account transfers when your device has no connectivity. I'll also avoid edge cases at this point, as I understand there could be web apps made for 3rd-world countries or the like where bandwidth is at a premium, so data across the wire must be under the microscope. I'm sticking with mainstream, modern-day web development here. I'll leave the door open for you to evaluate edge cases that don't fit the 80/20 rule and call for an exception.
I can always go to the server, scrape off the services layer, and re-introduce a new tech to replace that abstraction if needed. There are architectural recommendations within the server as well: make sure to thin out the services facade and keep it lightweight, as that too, as mentioned, is known to be volatile over time. On the front-end, I want this same flexibility. Odds are I'll be asked to rewrite my web app in a few years in 'project x' tech, which is the best thing ever for the web! I need to be lean up top and be able to scrape that thin veneer, or icing, off the cake and put a new layer on easily. This isn't so easy to do if I've put the majority of my code on the client. When the boss or architect comes to me and says, "we are redoing our financial app using new front-end tech," I'll be positioned to say, "no problem, we can mitigate the ripple effect because the key logic and inner workings of our app are stable and on the server, so we'll just need to redo the thin veneer that is the web-client code."
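To give a rough feel for what "scraping off the services layer" might look like, here's a hedged sketch reusing the hypothetical TransferService from earlier. An older WCF facade and a newer Web API facade both delegate to the same core; replacing yesterday's veneer with today's means rewriting only these few lines, not the business logic behind them.

```csharp
using System.ServiceModel;
using System.Web.Http;

// Yesterday's veneer: a WCF service contract that only delegates.
[ServiceContract]
public interface ITransferEndpoint
{
    [OperationContract]
    void Transfer(int fromId, int toId, decimal amount);
}

public class TransferWcfService : ITransferEndpoint
{
    private readonly TransferService _core;

    // In practice WCF would need a custom instance provider (or an IoC
    // integration) to supply this; constructor injection shown for brevity.
    public TransferWcfService(TransferService core)
    {
        _core = core;
    }

    public void Transfer(int fromId, int toId, decimal amount)
    {
        _core.Transfer(fromId, toId, amount);
    }
}

// Today's veneer: a Web API controller delegating to the same core.
public class TransferRequest
{
    public int FromId { get; set; }
    public int ToId { get; set; }
    public decimal Amount { get; set; }
}

public class TransferController : ApiController
{
    private readonly TransferService _core;

    public TransferController(TransferService core)
    {
        _core = core;
    }

    [HttpPost]
    public IHttpActionResult Post(TransferRequest request)
    {
        _core.Transfer(request.FromId, request.ToId, request.Amount);
        return Ok();
    }
}
```

Either facade is disposable; the stable, tested core behind it is what carries forward when the next delivery technology arrives.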