
Hiring a Freelance Big Data Architect

Why and when you need a Big Data Architect

Big Data Architects are among the most in-demand professional profiles in the IT sector. Their main goal: turn data into information that facilitates decision-making. However, when facing any project involving large amounts of data, stakeholders will face an overwhelming number of questions:

  • What strategy should we adopt for collecting and storing data?
  • What should we do with entries that are missing data?
  • What software technologies should we use to manage and analyze our data? (Hadoop, Elasticsearch, Storm, Spark, PySpark, Python, R, TensorFlow…)
  • What hardware? Should we run our software on our machines or should we externalize the service? And in the latter case, all of it or just a part?
  • How can we extract the most information from our data?
  • What algorithms do we need? What features do we need to extract from the data? How can we reduce dimensionality?
  • What models should we use to make predictions/recommendations?
  • And the list goes on and on and on...

It is really overwhelming. You are an expert in your field or department, a really good one, but you have opened Pandora's box. These questions are too technical and have too deep an impact on the project to be taken lightly. In addition, there is no single correct answer to them. Answers depend on many things: your objectives, the data you already have, the systems and workflows you already have in place, the money you want to spend…

A Big Data Architect holds the answers

You need a Big Data Architect to work with you on a strategy and a roadmap, so that your project achieves the best possible outcome within budget and deadlines.

The Big Data Architect you need combines extensive technical knowledge of data management, programming, modelling, machine learning algorithms, etc. with strong analytical, organizational, decision-making, social and communication skills, and is capable of understanding your business and organisation.
Let us once more highlight the importance of interpersonal and communication skills: the architect must deal with technical staff on one side, and with non-technical staff such as managers, CEOs and business analysts on the other.

Outvise finds you the right Big Data Architect for your project.

Find and hire the best big data freelancers to work onsite or remotely.

Certified Big Data Architects in the network

Big Data Architect case study for a network of retailers

Challenge, Context, Problems to be solved

A network of retailers needed to implement a system capable of:

  • Predicting the resource needs of each retailer in the network
  • Allocating resources to them

They had implemented an in-house solution that made the predictions using a neural network (TensorFlow) and allocated resources using an OR (operations research) algorithm implemented in Python (PuLP and CBC), Spark and PySpark. Everything was running on the client's GPU machines. It was a great start, but it had two main performance problems: the prediction of needs was not accurate enough, and the overall execution time was far too slow.
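To give a feel for the allocation step, here is a toy sketch in plain Python. The client's actual system formulated the problem as an optimization model with PuLP and the CBC solver; the proportional-allocation heuristic, retailer names and capacity figure below are illustrative assumptions, not the real model.

```python
# Toy resource allocation: split a fixed pool of resources among
# retailers in proportion to their predicted needs.
# The real system solved this as an OR optimization (PuLP + CBC);
# this heuristic only illustrates the inputs and outputs involved.

def allocate(predicted_needs, total_capacity):
    """Allocate total_capacity across retailers proportionally
    to their predicted needs, capped at each retailer's need."""
    total_need = sum(predicted_needs.values())
    if total_need <= total_capacity:
        # Enough capacity: every retailer gets exactly what it needs.
        return dict(predicted_needs)
    # Scarce capacity: scale every allocation down by the same factor.
    scale = total_capacity / total_need
    return {retailer: need * scale for retailer, need in predicted_needs.items()}

# Hypothetical predictions coming out of the neural network.
needs = {"store_a": 40.0, "store_b": 60.0, "store_c": 100.0}
allocation = allocate(needs, total_capacity=100.0)
print(allocation)  # each store gets half of its predicted need
```

In the real system the heuristic is replaced by a proper optimization model with constraints (transport costs, minimum stock levels, etc.), which is exactly where a solver such as CBC earns its keep.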

Time to hire a Big Data architect!

Mission, tools and methodology

A significant increase in prediction accuracy (up to the point where it met requirements) was achieved by:

  • Improving data quality by preparing and cleaning the data and adopting a missing-data policy
  • Reducing dimensionality by removing redundant data, via a principal component analysis (PCA) and selection of the most relevant components
  • Changing the neural network architecture (number of neurons and layers)
  • Changing the loss function (how the NN is rewarded during training)
  • Changing the features fed to the NN
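The dimensionality-reduction step above can be sketched with a minimal PCA built on NumPy's SVD. The data here is randomly generated for illustration; in the project, the components were computed on the retailers' historical data and only the most relevant ones were kept.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project X onto its top n_components principal components.

    X: (n_samples, n_features) data matrix.
    Returns the reduced (n_samples, n_components) matrix.
    """
    # PCA directions are defined on centered data.
    X_centered = X - X.mean(axis=0)
    # SVD of the centered matrix: rows of Vt are the principal axes,
    # ordered by how much variance they explain.
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    # Keep only the leading components and project the data onto them.
    return X_centered @ Vt[:n_components].T

# Hypothetical feature matrix: 100 samples of 20 correlated features
# (rank-5 data embedded in 20 dimensions, so 5 components suffice).
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 5))
X = base @ rng.normal(size=(5, 20))
X_reduced = pca_reduce(X, n_components=5)
print(X_reduced.shape)  # (100, 5)
```

Because the synthetic data has rank 5, reducing it to 5 components loses essentially no variance; on real data, the number of components is chosen by looking at how much variance each one explains.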

Achieved results

Execution time was improved by redistributing the machines on which each algorithm runs: neural networks are parallelizable by design and run very well on GPUs, whereas OR algorithms are not and run much faster on CPUs. In addition to moving the OR algorithm to CPUs, the algorithm itself was changed, together with the overall approach to the problem. As a result, the system met our client's requirements on both accuracy and running time.

Must have Big Data Architect skills

  • IT certifications or a Bachelor's degree in Computer Science or a related technical discipline, plus extensive work experience (10-12+ years of overall IT experience with Big Data, analytics, data warehousing and business intelligence)
  • Hands-on experience with current data technologies such as Hadoop, MapReduce, HBase, Oozie, Flume, MongoDB, Cassandra and Pig
  • Experience in data warehousing and mining
  • Up-to-date programming and web technologies: Python, Spark, Hive and Kafka on the data side; HTML5, CSS, JavaScript frameworks and RESTful services on the web side
  • Machine learning skills: pattern recognition, clustering and text mining are a few essentials
  • Ability to work in cloud environments and knowledge of cloud computing
  • Teamwork abilities: the Big Data Architect must be able to work in a team-oriented environment with people of diverse skill sets
  • Communication skills: Big Data Architects are required to engage with clients and stakeholders, understand their objectives for Big Data and reflect them in the architecture