Creating a list workflow for SharePoint Online 2013 using Visual Studio

By Satalyst Brilliance on 01 Dec 2014

A few months ago, we migrated from SharePoint 2010 on-premises to SharePoint Online 2013 – a huge paradigm shift for us and our custom workflows. If you're reading this blog you've probably noticed that documentation for SharePoint Online workflows is hard to come by, especially for Visual Studio rather than SharePoint Designer. So I'll share my experiences and hope they help.

Calling RESTful Services

The first pain point I encountered was migrating to declarative workflows and losing code-behind functionality. All custom work must be done in a RESTful service; the response comes back as JSON and must be parsed out of the new DynamicValue type. The syntax is quite verbose. First, I assign a string variable, ServiceCall, to the endpoint plus the method, like so: ServiceCall = "" Whether or not your method returns a value, it should be tagged in the service interface with the WebGet attribute. Then I execute an HttpSend activity; its ResponseContent must be a variable of the new DynamicValue type. Finally, I parse the DynamicValue response. In this case, the response JSON looks like this: {"IsCubeProcessingResult":{"Status":"COMPLETE"}} So parsing the response using the GetDynamicValueProperties activity looks like…
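The GetDynamicValueProperties activity pulls a value out of the response by following a property path into the JSON. As a rough illustration of what that parse does, here is an equivalent lookup in plain Python (the helper function and the "IsCubeProcessingResult/Status" path string are my own sketch, not part of the workflow API):

```python
import json

def get_dynamic_value_property(response_content: str, path: str):
    """Mimic the effect of GetDynamicValueProperties: walk a
    '/'-separated path into the parsed JSON response.
    Illustrative helper only."""
    value = json.loads(response_content)
    for key in path.split("/"):
        value = value[key]
    return value

# The JSON returned by the service call described above.
response = '{"IsCubeProcessingResult":{"Status":"COMPLETE"}}'
status = get_dynamic_value_property(response, "IsCubeProcessingResult/Status")
print(status)  # -> COMPLETE
```

In the workflow designer the same idea is expressed declaratively: you supply the path and bind the result to a workflow variable instead of calling a function.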

Read More

Discovering the Business value in Business Intelligence

By Satalyst Brilliance on 23 Dec 2014

In my previous life as a provider of Analytics, Reporting and Data Management I experienced first-hand the sudden surge in demand for historical, current and predictive views of business operations. I also experienced some of the challenges individual teams and whole companies faced as a result of rapid business growth and a lack of co-ordination across teams. Like many companies, my previous employer lacked a strategic, cohesive approach across business units to the planning and application of Data Management, Analytics and Reporting. One of the challenges we faced as a result was duplicated, or unintentionally divergent, ETL processes. For the team, this meant database refreshes took longer than necessary to complete. The refresh occurred every two days, involved two ETL processes and required hundreds of transformations – typically taking up to 12 hours in total, without any time accounted for error handling. This meant we could only start querying the database mid-morning the following day, and we were unable to run any scripts overnight while the refresh was happening. The differing ETL processes also produced slightly different result sets; although not a regular problem, this directly impacted our ability to leverage BI in…

Read More