Traditionally, product managers working on software products and platforms have made decisions based on sales insights, customer feedback, and feature requests. With big data and analytics being the buzzwords, the focus has shifted towards metrics-driven feature prioritization: if the numbers do not support a feature, its priority is critically debated until there are numbers to prove otherwise. Thanks to analytics and advances in data mining, product managers these days truly understand their customers, often better than the customers understand their own situation.

The industry has pretty much standardized the analytical data models required to make these decisions, but to make them quickly it is important to be able to tweak a product and deploy environments as and when required. Long application release cycles make it difficult to stay ahead of the competition. Imagine a feature that is production ready for analysis but delayed because infrastructure has to be ramped up and an environment made ready; this is the exact problem that Docker solves.

Docker is an open-source project that automates the deployment of applications inside software containers. So, what are software containers? Containers are a method of operating system virtualization that allows you to run an application and its dependencies in resource-isolated processes. Containers let you package an application's code, configuration, and dependencies into easy-to-use building blocks that deliver environmental consistency, operational efficiency, developer productivity, and version control. Containers can help ensure that applications deploy quickly, reliably, and consistently regardless of the deployment environment.

Teams can now stop wasting hours setting up developer environments, spinning up new instances, and making copies of production code to run locally. With Docker, you simply take a copy of your live environment and run it on any endpoint running a Docker engine. Packaging an application in a container with its configuration and dependencies guarantees that the application will always work as designed in any environment: locally, on another machine, in test, or in production. No more worrying about installing the same configuration in different environments.
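
To make this concrete, here is a minimal sketch of what packaging an application looks like in practice. This is purely illustrative and not from the original post: it assumes a Node.js web app with an app.js and package.json in the current directory, and the image name and port are placeholders.

 # Dockerfile - packages code, configuration and dependencies into one image
 FROM node:18-alpine              # base image providing the runtime
 WORKDIR /app
 COPY package*.json ./            # dependency manifest
 RUN npm install                  # dependencies are installed inside the image
 COPY . .                         # application code and configuration
 EXPOSE 3000
 CMD ["node", "app.js"]           # the same start command everywhere

 # Build once, then run the identical image locally, in test or in production:
 #   docker build -t my-app .
 #   docker run -p 3000:3000 my-app

The same image runs unchanged on any endpoint with a Docker engine, which is what delivers the environmental consistency described above.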

To get started with Docker, or for a more in-depth look, watch the video below:



It would be interesting to know how product teams can leverage Docker. If you have any brilliant ideas, please share them in the comments below.

WCF REST provides the flexibility to transport data over HTTP and should certainly be preferred when data needs to be accessed from applications built on different technologies. There are times when this is not possible, or when all of the involved web service consumers or service clients are built on .NET; chances are they prefer SOAP over REST, and then both REST and SOAP endpoints need to coexist in a single service. This series provides tips on combining SOAP with REST in a single WCF service.

Below I have compared how behaviors look for a SOAP-only service versus a SOAP+REST service.

A behavior defines how the endpoint interacts with clients. Attributes like security, concurrency, caching, and logging are part of the service behavior; behaviors describe to the client what the service will be. Even though behaviors can be added in three different ways (code, configuration, and attributes), I will use configuration for this example. Below is a comparison of how your SOAP vs SOAP+REST behaviors would look in configuration.

SOAP:

<behaviors>
  <serviceBehaviors>
    <behavior name="SoapBehave">
      <serviceMetadata httpGetEnabled="true" />
      <serviceDebug includeExceptionDetailInFaults="false" />
      <serviceThrottling maxConcurrentCalls="100" maxConcurrentSessions="50" maxConcurrentInstances="50" />
      <dataContractSerializer maxItemsInObjectGraph="2147483647" />
    </behavior>
  </serviceBehaviors>
</behaviors>

SOAP+REST:

<behaviors>
  <endpointBehaviors>
    <behavior name="web">
      <webHttp />
    </behavior>
  </endpointBehaviors>
  <serviceBehaviors>
    <behavior name="internal">
      <serviceMetadata httpGetEnabled="true" />
      <serviceDebug includeExceptionDetailInFaults="false" />
      <serviceThrottling maxConcurrentCalls="100" maxConcurrentSessions="50" maxConcurrentInstances="50" />
      <dataContractSerializer maxItemsInObjectGraph="2147483647" />
    </behavior>
  </serviceBehaviors>
</behaviors>
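
The behavior sections above are only half the story; the coexistence comes from wiring two endpoints, one SOAP and one REST, to the same service, with the REST endpoint pointing at the webHttp endpoint behavior. Here is a rough sketch of how that services section could look; the service name, contract, and addresses are placeholders for illustration, not from the original configuration:

<services>
  <service name="MyCompany.OrderService" behaviorConfiguration="internal">
    <!-- SOAP endpoint for .NET / SOAP clients -->
    <endpoint address="soap" binding="basicHttpBinding"
              contract="MyCompany.IOrderService" />
    <!-- REST endpoint, using the webHttp endpoint behavior defined above -->
    <endpoint address="rest" binding="webHttpBinding"
              behaviorConfiguration="web"
              contract="MyCompany.IOrderService" />
    <!-- Metadata endpoint so the WSDL is exposed via httpGetEnabled -->
    <endpoint address="mex" binding="mexHttpBinding"
              contract="IMetadataExchange" />
  </service>
</services>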

 

Today I was faced with a challenge in a WCF service where my hosted service returned an HTTP 504 response.

HTTP/1.1 504 Fiddler – Receive Failure
Date: Wed, 15 Jan 2014 18:53:11 GMT
Content-Type: text/html; charset=UTF-8
Connection: close
Timestamp: 10:53:11.745

[Fiddler] ReadResponse() failed: The server did not return a response for this request. Server returned 0 bytes.

Initially my thought was that the issue was due to a WCF REST POST request without any inbound request data stream, but after a bit of research I found that my hosted service was unable to serialize the huge XML data object I was returning from the datastore. I resolved the issue by adding maxItemsInObjectGraph to the dataContractSerializer element of the service behavior in web.config.

I think WCF, or any API, is a game of correct configuration, and understanding what these configurations mean is important for API design. Let's understand what dataContractSerializer and maxItemsInObjectGraph are and how to use them in the service configuration of an API.

As we know, a service behavior defines how the endpoint interacts with clients. Attributes like security, concurrency, caching, and logging are all part of the behavior, and so is dataContractSerializer. The DataContractSerializer serializes and deserializes an instance of a type to and from an XML stream or document using a supplied data contract, and maxItemsInObjectGraph sets the maximum number of items in an object graph to serialize or deserialize (the WCF default is 65536). Why 2147483647? maxItemsInObjectGraph is an integer, and 2147483647 is the largest value a 32-bit signed (two's complement) integer can represent, so setting it effectively removes the limit on the number of objects in the graph and allows very large XML/JSON responses. If you are not sure, or if you think the data might grow over time, I think it is better to set this property to the maximum.

Here is what my service behavior block in web.config looks like.

<serviceBehaviors>
  <behavior name="ServiceBehaviors">
    <serviceMetadata httpGetEnabled="true" />
    <serviceDebug includeExceptionDetailInFaults="true" />
    <serviceThrottling maxConcurrentCalls="100" maxConcurrentSessions="50" maxConcurrentInstances="50" />
    <serviceAuthorization serviceAuthorizationManagerType="CCAService.AuthorizationManager, CCAService" />
    <dataContractSerializer maxItemsInObjectGraph="2147483647" />
  </behavior>
</serviceBehaviors>
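
Since behaviors can also be applied in code, the same limit can be set with the ServiceBehaviorAttribute instead of web.config. A minimal sketch, with a hypothetical service class name:

using System.ServiceModel;

// Code equivalent of the dataContractSerializer/maxItemsInObjectGraph setting above.
// int.MaxValue == 2147483647, the largest 32-bit signed integer.
[ServiceBehavior(MaxItemsInObjectGraph = int.MaxValue)]
public class CcaService : ICcaService
{
    // service operations ...
}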