Does Anyone Really Know What ERP is?

A couple of other Veterans and I were recently trying to get clear on the distinction between SAP and Enterprise Resource Planning (ERP). We have all found that the terminology seems to be used interchangeably depending on who is speaking. We were left wondering how many people really know the difference between SAP and ERP.

I declare right now that I am not an expert either on SAP or on Enterprise Resource Planning (ERP). I'll welcome the input of anyone who wants to improve my education on this.

I am an ex-Army Road Transport and Distribution Officer with Project Management experience who taught herself SAP in a Concrete Manufacturing administration role. I am first and foremost an exceptional SAP Data Entry Operator - what's called a Super User.

But wait - there's more!

One of the things that is built in to military personnel is the pursuit of excellence and attention to detail. For many of us it is simply impossible to say "near enough is good enough". We are always searching for a better way - and crucially - we will act to change a situation. Valuing information as a commodity or resource is part of our DNA. We also have a passionate hatred of rework and wasted time (at least I do). We work hard to develop the habit of stepping back and assessing situations objectively.

Basically - it was impossible for an insatiably curious perfectionist like me to simply sit back and enter the data like a robot. The understanding that things could be done better was an itch that I just had to scratch. I had to understand where my data was ending up, who was looking at it and what was being done with it - I had to do my reconnaissance. I didn't see any reason to lose that Army leadership habit of maintaining situational awareness - knowing a bit about everything and everyone around you - knowing what is going on.

So what do I know about Enterprise Resource Planning?

Concrete Manufacturing is about correct product, to specification, in full, on time - every time. My purchasing role (data entry on purchase requisitions) rapidly expanded into Plant resupply. Some of my supply chains were six months long and quite literally "on a slow boat from China". The regional plant location was tiny - so there was no space for storage. That gave a whole new meaning to "Just in Time" delivery. In nine years I only missed delivery in time for production once - by one day.

If ERP is about having your "bullets and bombs" where they need to be the day before they are needed then I am an expert.

Concrete Manufacture is an equipment heavy environment: gantry cranes, forklifts, pipe manufacture installations (yeah - a whole damned building), pre-stressing installations, concrete agitators and offices, among other things. All require maintenance - often by specialist technicians who cannot be on site within three days. Profit margins are knife edge - so maintenance budgets are monitored under a magnifier. The fleet itself was old - the only thing in the plant newer than twenty years was the mobile crane. There was no redundancy on critical equipment like the mobile crane - no spare gear. Through various changes of management I found myself collecting cost capture on fleet items, predicting and managing maintenance budgets and contracts, and co-ordinating the maintenance itself, right up to handling the money on capital acquisition.

If ERP is about keeping equipment working to keep manufacturing flowing then I am an expert.

What about the people? My job was data entry of payroll and collection of cost capture on labour hours. We had a lot of changes in management. Guess who became the single point of continuity for HR related matters for the team? I didn't do any hiring or firing, but at various times I maintained the personnel files, co-ordinated training, administered pay rises and co-ordinated leave. I even handled welfare and rehabilitation.

If ERP is about ensuring you have the right people for the job then I am an expert.

And the tool I used to collect and analyse all of that information for forward planning?

Systems Applications and Products in Data Processing (SAP).

Thoughts on SAP from a Veteran Super User

In January 2011 I started a job with a local concrete manufacturing plant that produces things like box culverts, bridge decks and kerbing for a large multinational quarrying and concrete production company. I had some knowledge of database operation and was a fairly competent operator when it came to computers. All the same, my job interview included a lot of "I can learn".

Luckily, I was filling an opening left by a retirement, and the previous incumbent was kept on to train me in my role. Many of the processes in use in the office were still manual and paper based at the time (2011). Their chief operating tool was a database system I had never heard of - Systems, Applications and Products in Data Processing, or SAP as it is most commonly known.

According to Google, as of 2020, 77% of the world's transaction revenue touches an SAP system (I believe it) and 98% of the world's most valued brands are SAP customers. As of 2020 SAP also had the third highest revenue of any software company in the world, behind Microsoft and Oracle, and $16 trillion of consumer purchases worldwide go through SAP systems annually (I believe this too!). Interestingly, women make up just over 27% of the SAP management team.

You can read a lot of blue sky language about what SAP is and how it does things - Enterprise Resource Planning, Human Capital Management, synergistic process, optimisation, end to end solutions, business integration. If you believe the hype, SAP will deliver all of your business information management solutions in one neat integrated package. SAP has an application product targeted at every function in every business (including military) - and they will custom build if you can pay enough money.

The short version is that SAP is a really big, complex database - data goes in, data gets processed and decisions get made based on the data. Like all databases, it is only as good as the information in it and the people who operate it.

A login was obtained for me and I was introduced to the system by a woman who was clearly afraid of the application and did not trust her ability to operate it - despite the fact that she had quite extensive training on SAP. The change management on the implementation by the multinational had not engendered employee buy-in at operator level or in middle and lower management. Nobody was using the database at full capacity.

I administered Purchasing, Supplier Payment, Payroll, Inventory Administration (Stock Control), Maintenance Spending and Manufacturing Production Entry. The only key cash flow process I had nothing to do with was sales - but I had visibility on that and Distribution too.

The first thing I noticed about SAP was the somewhat confronting and unfriendly graphical user interface (GUI). In true veteran fashion I ignored the interface and focused on learning the capabilities of the tool. After all, how much does it matter what a rifle looks like as long as you can rely on it?

The second thing I noticed was that the data across the board for this particular plant might have been about 30% accurate at best, and that almost no-one trusted the stock levels, the purchasing or the payroll figures. Cost capture reports may as well have been fairy tales - and production quantities and outputs? 80% accurate at best. The whole plant was riddled with redundant and parallel processes. Every time a customer wanted to buy something the yard man was sent out to visually check the accuracy of database stock levels - taking the only crane out of action for manufacturing. In short - they were entering rubbish into the database and getting the inevitable rubbish out. Money was walking out the door and washing down the drains simply through slack information management practices and poor supporting processes. Of course SAP got the blame.

Enter the champion for accuracy and attention to detail - me! In true veteran fashion I just saw that something needed to be done and set about doing it. I started with the key entry nodes - making sure my production, purchasing and time sheet entry was 100% perfect at the point of entry - no amending after the fact. Making sure that what went into the Database was 100% perfect or as near as I could get it.

Then I set about fixing the people. I set standards, communicated them clearly and held people accountable for meeting them. I was a key player in driving compliance with stock control practices. I insisted on absolute and complete transparency in dealing with broken or substandard stock (no hiding stock write-offs to make KPIs look good). People got paid correctly and on time, every time. When it came to purchasing compliance there was nowhere to hide - I knew exactly where every dollar went and when. Suppliers loved me because they always got paid on time, in full, as agreed. Managers? - sometimes not so much.

By the time I left the company production entry into stock had been 98% accurate for about two years (my data entry accuracy was 100%). Inventory accuracy on Annual Stocktake was down to about a 7% variance (best in the country - and no fudging figures). My purchasing & supplier invoice payment KPIs were perfect. Most important to me - the team trusted me implicitly to get their pay cheques right.

I couldn't have achieved any of that without the out of silo visibility SAP delivers across business functions. If you know what to look for in SAP you can find everything you want to know about a business - even when you are "only the office lady" with restricted access. The good, the bad, and all of the ugly. I rapidly fell in love with the analytical capabilities of the database and used them to save millions for my plant over the years. I used it to compile maintenance budgets, crystal ball raw material stock forward orders, reduce capital tied up in obsolete stock, address payroll/labour inconsistencies, spot theft, and reward achievement.

SAP is big and complex and takes time to learn - but the payoffs are well worth the effort.

Israeli Fighter Pilots, Photons and Reckless Canadian Businessmen

While running an Excel boot camp, we got talking about ranking sales employees against sales targets. One of our brilliant students brought up a supposedly well-respected Canadian businessman who was known for doing just that - and then firing his bottom-performing staff - every month.

I immediately thought of three things: Israeli fighter pilots, photon pairs at the Large Hadron Collider (LHC) in Switzerland and the search for loaded dice. 

I promise all of these are very relevant, and I hope that by the end of this I will have convinced you to respect that Canadian businessman a little less.

The problem is this: we often misguidedly use events of random probability to confirm theories and justify practices.

Probability in Particle Physics

In 2010 CERN fired up the LHC and started smashing together protons (hydrogen nuclei) at close to the speed of light, with the goal of finding new, undiscovered particles. The collisions create conditions similar to those just after the Big Bang, so we can see how high energy particles behave. Any heavy particles created then decay into lighter ones, which we can observe.

The problem facing the experiment was how to pull a signal out of the random noise that filled the data. Most proton collisions are pretty boring and don't create new particles, but they still show up in the data.

Particle detection relies on conservation of energy. The important thing to know is that the new particle we are looking for should decay into two photons (particles of light). So the detector builds a dataset that records the total energy every time it sees two photons at the same time.

There are of course photons bouncing around all over the place, so we expect to see a smooth distribution of background events - but since the photons from a specific new particle will always carry the exact same total energy, a real particle shows up as a peak in that distribution.

Simply put, we look for a little bump among the noise, then work out how likely it is that the noise made that bump. If it is likely to be noise, we ignore it; if not, we can assume it's a particle.

[Figure: taken from the 2012 ATLAS Higgs boson discovery paper. The x-axis is the combined energy of the two photons; the dotted curve shows the expected background; the bottom section shows the data with the background removed. The little bump shows an excess over what we would expect from background alone - telling us there is a new particle that doesn't come from random background fluctuations.]

In 2012 CERN measured “a signal with a significance of greater than 5 sigma”. This meant there was about a one in 1.74 million chance that the observed signal came from random statistical fluctuations. Given this is pretty unlikely, they concluded it wasn't noise and was a new particle.
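If you want to sanity check that figure, converting a sigma level into a probability takes one line of SciPy. A minimal sketch, assuming the two-sided convention that matches the quoted number:

```python
# Convert a 5 sigma significance into a tail probability using the
# standard normal survival function (two-sided convention).
from scipy.stats import norm

p = 2 * norm.sf(5)  # chance of a fluctuation at least this extreme
print(f"p = {p:.2e}, about 1 in {1 / p:,.0f}")  # ~1 in 1.74 million
```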

CERN announced the discovery of the Higgs boson or “God Particle”, much champagne was drunk and physicists the world over celebrated.

In 2015 the LHC wasn't done searching - it was still hunting for new particles. By collecting the particles that were spat out and adding up their energy, the team at CERN would be able to see a new particle as a peak in a graph.

The important thing to consider is the question: “How likely is it that this signal came from random noise?”

CERN once again measured a signal from 2 particles of light, this time at an energy of 750 GeV (about 3 times the mass of a Uranium atom). 

After measuring the statistical significance, it was found that the probability of this signal being created by random statistical fluctuations was one in 15,787.

Theoretical physicists rejoiced, and hundreds of papers were published on how the newly discovered particle proved this or that theory. However, all was not well: the signal would later turn out to be a statistical fluctuation.

You can already see that this looks less convincing than the Higgs peak.

The problem comes from the difference between global and local significance.

There was a one in 15,787 chance of finding that signal where we found it - but if you look long enough for something rare, you will eventually find something rare. The people hailing the new particle had considered the probability of finding this signal in the exact spot they found it, but not the probability of finding it "somewhere or anywhere".

The chance of winning the Euro lottery is one in 95 million. Does this mean we can take every lottery winner and lock them up for cheating? After all, it seems far more likely that they cheated than that they beat the one in 95 million odds fairly.

Of course not!

There is a one in 95 million chance of YOU winning the lottery; this is not the same as the probability that SOMEONE wins the lottery.

The “local” probability of quadruplets is one in 570,000 births, but with 7 billion people on the planet the “global” probability that quadruplets exist somewhere is very high. In the same way, the local probability of our new particle's signal being noise was very low, but the global probability of finding some signal somewhere in the noise was very high.
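Here is a small sketch of how quickly a rare local fluke becomes a likely global one. The counts of "places to look" are invented for illustration; real bump hunts compute this properly as the look-elsewhere effect:

```python
# A 1 in 15,787 fluctuation is rare in any ONE place, but searching
# many independent places makes finding one somewhere quite likely.
p_local = 1 / 15_787

for n_places in [1, 100, 1_000, 10_000]:
    p_global = 1 - (1 - p_local) ** n_places  # P(at least one fake bump)
    print(f"{n_places:>6} places searched -> {p_global:6.2%} chance of a fluke")
```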

Israeli Fighter Pilots

During a training program for pilots in the Israeli air force, the trainers wanted to know the best method of improving pilot performance.

The trainers tried out two different training styles.

When a pilot underperformed, they were admonished for their screw-ups, whereas pilots who did particularly well were given compliments for their success.

[Image: a very good fighter pilot]

On the next training run, the trainers found that the pilots who had been given praise were less likely to do as well, while the pilots who had been reprimanded improved.

They concluded that negative reinforcement works, while positive reinforcement not only doesn't work but has a negative effect.

Why might that be? This flies in the face of nearly every controlled psychological study, so what was going on? Is flying just different to every other kind of learning?

As a Dungeons and Dragons player, I thought maybe I could adopt their techniques.

I started with 100 dice. 

After rolling them all 20 times and adding their scores, I took the 10 worst performing dice (the naughty dice) and yelled at them - really just let them have it. I told them that if they didn't improve I would fire them and melt them down for spare plastic.

I then took the 10 best performing dice (the good dice), put them in a dice jacuzzi, offered them promotions and told them they were the best.

What do you know - it worked! The under-performing dice improved, while the dice that did well initially got worse. Apparently inanimate plastic dice share a fundamental psychology with fighter pilots!
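If you would like to replay the experiment without abusing any plastic, here is a minimal sketch in Python. The exact numbers will differ run to run, but the pattern won't: the worst group climbs toward the average and the best group slides back to it.

```python
# Regression to the mean with 100 simulated dice: score each die as the
# sum of 20 rolls, pick out the worst and best 10, then score everyone
# again. No yelling or jacuzzis required.
import random

def score():
    return sum(random.randint(1, 6) for _ in range(20))

first = {die: score() for die in range(100)}
ranked = sorted(first, key=first.get)
worst, best = ranked[:10], ranked[-10:]

second = {die: score() for die in range(100)}  # everyone rolls again

def avg(group, rolls):
    return sum(rolls[die] for die in group) / len(group)

print(f"worst 10: {avg(worst, first):.1f} -> {avg(worst, second):.1f}")  # 'improves'
print(f"best 10:  {avg(best, first):.1f} -> {avg(best, second):.1f}")    # 'gets worse'
```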

Of course I’m being a little silly here but that’s the point.

If we look at a bell curve generated by random events, the edges of the bell curve are occupied by the best and worst performers, but random chance governs our lives more than we’d like to admit. 

If we select the worst performers, then let them try again, chances are that they will approach the average, and thus, improve. The same is true for those that overachieve, they will approach the average and thus, they will seem to get worse. 

This is called “Regression to the Mean”.

Would I have improved my scores if I fired my underperforming dice every 5 rolls? Of course not. The same is likely true for sales people (as it is for fighter pilots). 

That’s not to say that we shouldn’t be concerned with underperformers, but we need to consider why we are looking at a certain sample. 

If we already have concerns about a specific salesperson, then a low sales score could indicate lacklustre performance. But if we select someone precisely because of a single low score, then that score must be the START of data taking, not the evidence itself. Otherwise you will likely be firing your best salespeople because, one month, random chance put them on the left hand side of the bell curve.

Firing your underperforming employees every month might make you look tough, but anyone with an understanding of statistics should realise that you are a very, very silly business person who is likely blowing way too much money on HR and severance packages.

Veterans' groups that can analyse the facts can help other veterans!

As Guy Parent, the Canadian Veterans Ombudsman, said in 2015:

"There is nothing more powerful than freeing the facts. Facts arrived at by rigorous research and evidence-based analysis generate and focus debate. They empower citizens and enhance citizen engagement with government. This combined effect cannot easily be ignored, and it creates the conditions needed to shape public policy."

We need to think about the data we need, prioritise the questions, then collect, clean, analyse and present the results so that leaders, public servants and the public can act on the facts - to help both the veteran and the citizen.

Data Challenge #2 - Maccas

Challenge your Data Analytics skills!

How it works:

Each fortnight we will present a data challenge to be solved. These challenges will be a mix of Excel, Power BI, Tableau and SQL and are designed to increase your proficiency in skills across the data pathway. We encourage everyone on the Data Pathway to take a crack and challenge yourself!

The winners will be announced on the SitRep, fortnightly on Fridays at 4 pm AEST.
The stream can be found here: https://www.twitch.tv/withyouwithstream
and winners will receive a badge to proudly display on the WYMW Data Discord!

Challenge #2 - Maccas

Download the Dataset here

Scenario

You are working with a team of Data Analysts doing a health survey of fast-food chains in the USA. You have been given Maccas (McDonald's) to analyse.
Your task is to determine the correlation between various metrics such as protein, sugar and fat - and to plot the relationships effectively.
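Before you dive into the questions, here is roughly what the core of this analysis looks like in pandas. This is a sketch only: the file name "menu.csv" and the exact column names are assumptions, so match them to the dataset you download.

```python
# Pairwise correlations across the nutrition columns, plus one specific
# pair (sugars vs carbs) as a single coefficient.
import pandas as pd

menu = pd.read_csv("menu.csv")  # hypothetical file name
cols = ["Calories", "Carbohydrates", "Sugars", "Protein", "Total Fat"]

print(menu[cols].corr())                           # correlation matrix
print(menu["Sugars"].corr(menu["Carbohydrates"]))  # single pair
```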

Challenge

Using Excel, Power BI or Tableau, complete the following:

1. Create a Correlation Plot to answer the following questions:

Q1 - Correlation visualisation of Calories, Calories from fat, Carbs, Cholesterol, Dietary fibre, Protein, Saturated fat, Sodium, Sugars, Total fat and Trans fat
Q2 - Identify the correlation between sugars and carbs - are they correlated? If so, is the correlation positive or negative, strong or weak?
Q3 - Create a slicer showing the CORREL between Protein and Total fat within the Beef and Pork category
Q4 - Using the slicer, show the CORREL between Calories and Carbs within the Breakfast menu category
Q5 - Using the slicer, find the ratio of protein to sugars within the Iced Mocha menu item
Q6 - Create a dashboard with a card, slicer and visualisation demonstrating the correlation between 2 different measurements within 2 different menu categories

2. Create a Scatterplot to answer the following questions:

Q1 - Create a scatter plot of protein against total fat and add a trend line - is there a correlation?
Q2 - What are the median and the average of the data presented?
Q3 - Add 95th percentile lines to the scatter plot - which items are within this range?
Q4 - Which item is within the 99th percentile range?
Q5 - Create a dashboard with the scatter plot visualisation, a correlation card and a slicer (for menu category and item category)

Lastly, to practice your skills, you can complete the following questions:
(note these last questions will not be assessed for the challenge, but are good practice!)

Q1 - Average total calories of each 'Category' menu item (which has the highest average?)
Q2 - Average calories from fat in each 'Category' menu item (which has the highest average?)
Q3 - Total fat in one specific menu category (specify which menu item this is)
Q4 - Percentage difference between the highest and lowest total fat breakfast items
Q5 - Count of each 'Category' of breakfast item (which has the highest count of menu items, and which has the lowest?)

Submit your answers

Submissions close 04/12/2020 at 12:00pm AEST.
Submit your answers directly to Jimmy via Discord.
Good luck!

If you’re having any difficulty, jump into our Data Analyst Discord Server and join the discussion!

Not yet on the Data Analytics pathway? Click the button below to explore our courses:

Tip for Tracking Cell References in Excel

If cell referencing in Excel sends you cross eyed then this little tip will get you out of trouble. So good even the little birdie wouldn't stop singing about it!

Data Noob - Data Warehouse Vs Data Lake

When I stepped into the Data Analytics world I encountered a mind-boggling array of confusing, apparently ambiguous and difficult to decipher terminology: all sorts of cute acronyms, dazzling visualisations, pretty dashboards and "snake oil" solutions to business pain points.

Be gentle with me - I'm a Noob in this space but I'll welcome conversations in the interest of learning more.

What on earth are Data Lakes, Data Warehouses, Data Clouds, Data Deserts, Data Silos, Data Swamps? Data is the new oil? Yes, Data Ecosystems are a thing. I haven't found evidence of a Data Porcupine yet - but if one exists it will probably be in the Cyber Security world.

Let's start with a Data Warehouse. A country woman, I've found a surprising real world equivalent to a Data Warehouse. I've included in this article a picture of a fairly complex Bulk Grain Silo Installation which works fairly well as a rough visualisation of how a Data Warehouse operates.

Grain handled in bulk is stored in silos which generally only contain one type of grain - for example, wheat, corn or barley. Grains are designated to silos before loading. Grain is loaded and unloaded in a highly organised, linear process: one truck lines up behind another to load into, or out of, the designated silo. Grain is moved around the installation using either augers or conveyor belts, and can only move in pre-determined directions - up, down, in or out - along pre-determined paths before it can be mixed or processed. Grains may be mixed or processed in some way to produce a designated value-added product, and at some point a human must be involved to ensure that the correct raw materials are mixed together to produce a saleable, quality assured, digestible product. When bulk grain moves it has viscosity - it flows and it gets blocked. Movement, age and use degrade quality. Grain has to be cleaned and preserved against invaders.

If you call a grain a piece of data you can get an idea of how a data warehouse stores and processes data. Designate the wheat silo "Sales Data". Designate the corn silo as "Costs". Call a conveyor a "Search" and call a feed mixer a "Report, or an Insight". Call the retail packaging a "Visualisation".

Can you see the similarity?

By contrast, a Data Lake is a much more fluid thing. First of all you can store a lot more in it, and the potential for scalability is far higher. Lakes can exist anywhere the "landscape" allows. You can draw a cup of water, or, if you have the right tool, you can empty the whole lake and wait for it to refill. You can draw water from any location in the lake. The water will have trickled or flowed in from any number of sources - e.g. clouds, rain, runoff, soakage from underground, illegal industrial dumping, rivers, or some kid emptying their lunch water bottle into it. The water may have been in the lake for a hundred years, or it may have arrived yesterday in a flood. And yes - people do talk about Data Swamps, where the water is muddy, sludgy and slow moving.

Call a water molecule a piece of data. Clearly in a Data Lake you need special tools to refine, package and move the water. The data does not become actionable information/insight until it is packaged in a digestible fashion. On the upside you get access to datasets only limited in size by the amount of storage you have available.

Do you see the connection?

Data Warehouses are structured and organised but have limitations in size and scalability. In a Data Warehouse the interrelationships between data types must be pre-determined - not so in a Data Lake. Data Lakes are huge, flexible and scalable: you get access to a dataset limited only by your storage, but it takes more effort to structure the data in order to produce insight and understanding. In some ways a Data Warehouse is often needed within a Data Lake to structure information so it can be interpreted by a human being. Human beings generally struggle to reduce unstructured data to a digestible form by ourselves - there are simply too many things to look at. Data Lake analytic tools basically take unstructured data and turn it into structured data so human beings can interpret it.
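For the technically inclined, here is a toy sketch of the same contrast in Python: the warehouse enforces structure when data is written, while the lake only imposes structure when data is read. All file, table and column names here are invented for illustration.

```python
# Warehouse: schema-on-write. The "silo" (table) is designated before
# loading, and every row must fit the declared columns.
import json
import sqlite3
from pathlib import Path

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (product TEXT, qty INTEGER, price REAL)")
db.execute("INSERT INTO sales VALUES ('box culvert', 4, 1200.0)")
print(db.execute("SELECT * FROM sales").fetchall())

# Lake: schema-on-read. Dump raw records now; impose structure later,
# at read time, with whatever tool you bring to the water's edge.
lake = Path("lake")
lake.mkdir(exist_ok=True)
(lake / "2020-11-20.json").write_text(
    json.dumps({"product": "bridge deck", "qty": 1, "notes": "urgent"}))

records = [json.loads(f.read_text()) for f in lake.glob("*.json")]
print(records)
```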

Of course this is a very over-simplified version of the technicalities of how Data Warehouses and Data Lakes operate. We all need to start somewhere, right?

At least that's how this Data Noob sees things.

Fortnightly Data Challenge #1 - Museums

Challenge your Data Analytics skills!

How it works:

Each fortnight we will present a data challenge to be solved. These challenges will be a mix of Excel, Power BI, Tableau and SQL and are designed to increase your proficiency in skills across the data pathway. We encourage everyone on the Data Pathway to take a crack and challenge yourself!

While you will be able to compete in the challenges with just the skills learnt in the course, to really excel and come out on top we encourage you to do some research of your own! After the challenge is complete, I (Jimmy) will go step-by-step through the challenge with a video walkthrough. Submit your answers through Discord - and follow along for all updates and discussion!

The winners will be announced on the SitRep, fortnightly on Fridays at 4 pm AEST.
The stream can be found here: https://www.twitch.tv/withyouwithstream
and winners will receive a badge to proudly display on the WYMW Data Discord!

Challenge #1 - Museums

Download the Workbook here

The workbook contains the following sheets:
1. museums_USA
Your dataset for the challenge. Contains name, type, location, income and revenue for museums in the USA.

2. challenge questions
This sheet has 10 questions to be answered using the COUNT, SUM and AVERAGE formulas, including their IF and IFS variants (COUNTIF, SUMIFS and so on).

3. Selector Challenge
This sheet contains the scenario, the challenge and points score.

Scoring

You will be scored on both the Challenge Questions (score out of 10) and the Selector Challenge (score out of 5) for a total of 15 points.

Selector Challenge

Scenario

You are a Data Analyst at Museum HQ™ in the USA.

You keep receiving requests from state managers asking: 1. how many museums are in their state, 2. what the total and average revenue per museum is, and 3. how these figures differ by museum type.

Rather than querying the data each time you receive a request, you have decided to build an interactive worksheet for users to make a selection and receive the metrics they need. 

Challenge

Build an interactive worksheet that shows the Count of Museums, Sum of Revenue and Average of Revenue for the selected Museum Type and State. You can use the cells to the left or create a new worksheet. Points are awarded based on the levels of interactivity for users, as per the table below (and if you want to check your numbers, see the pandas sketch after the table).

Points Score: Levels of Interactivity

Users can type or copy/paste directly into cells: 1 point
Users can select from a dropdown: 2 points
Users can type & the dropdown list filters: 3 points
Allow the use of wildcards: +1 point
Users can select any number of selectors: +1 point
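If you want to sanity check your worksheet's numbers in another tool, the underlying metrics come down to a few lines of pandas. A sketch only: "museums.csv" and the column names are assumptions, so adjust them to the workbook.

```python
# Count of museums plus total and average revenue, grouped by state and
# museum type - the numbers the interactive selector should return.
import pandas as pd

museums = pd.read_csv("museums.csv")  # hypothetical export of museums_USA

summary = (museums
           .groupby(["State", "Museum Type"])["Revenue"]
           .agg(count="count", total="sum", average="mean"))

print(summary.loc["TX"])  # e.g. one state manager's request
```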

Submit your answers

Submissions close 30/10/2020 at 12:00pm AEST.
Submit your answers directly to Jimmy via the Data Discord.
Good luck!

Not yet on the Data Analytics pathway? Click the button below to explore our courses:

What would an ex military cowgirl do in a Hackathon team?

I first saw the word "Hackathon" and thought it might have something to do with clearing scrub. You know - a bunch of people get together to hack away at weed eradication. I saw the term "Pen Testing" and thought "don't they test the Pens before they leave the factory?"

When one of the jet techie types I work with suggested I join a Hackathon team (after gently explaining what it all meant with a straight face) my first thought was:

"What could a middle aged, ex Army, office lady/cowgirl with almost no understanding of cyber security possibly contribute to a gun Hackathon Team of Cyber Security Jets?" So I rounded up my friendly neighbourhood Cyber Security Jet and asked the question.

Turns out the answer is: "quite a bit".

So what is a Hackathon?

Basically, a mission/objective is set and teams are put together with the desired skill sets to achieve it. Teams may require a whole range of skill sets, including Data Analysts, Penetration Testers, Cyber Security Specialists, Database Operators, Baristas (coffee is a must), IT Technicians, Project Managers, Team Captains and Subject Matter Experts - any skill set relevant to the objective. You may be unable to operate a computer and still be a key member of a Hackathon team: the sneaky mindset of the successful wargamer or tactician is highly valued. If you're not so great on a computer, or new to the scene, you still bring a very valuable objective perspective to the table and add to the diversity of the team.

No different really to the way military teams are organised and configured. If you need a driver, a demolitions expert and an interpreter, then that is how you configure your team.

Mission types vary. Cyber Security and Penetration Testing are perhaps the best known. Hackathons are also used for Intelligence Gathering. Sometimes they have Programming/Coding Competitions - sometimes it's about developing new applications or showcasing new ideas. Hackathons are fantastic networking opportunities for industry specialists and a great way to practice/display your skills - especially if you win! You can use them to gain experience for your Resume and make business contacts.

Teams are usually put together well before the event to allow for troops-to-task allocation, setup and rehearsal. Most of the time a reasonably decent laptop with webcam/mic and a standard internet connection are all you need, and most of the popular operating systems are more than up to the task. My Cyber Security Jet says that YouTube will be your best friend when it comes to learning the software - and you should expect to have to download some.

So what can someone like me offer the team? An analytical mindset, curiosity, communication skills, organisational skills, leadership skills, life experience, strategic and tactical problem solving, "think outside the box" creativity, an agile mindset, diversity, range, motivation, drive and tenacity. As a recent major Hackathon winner said to me - one of the advantages veterans have is that we just keep trying until we find a solution.

You better believe that a veteran can contribute to a Hackathon - regardless of the level or nature of your computer skills. I'm just about to enter my first Hackathon - I'll let you know how it goes.

'Bad data in - Bad data out'

For those veterans still wondering what data analysis is and what role they can play - here is a quick story to get you read in.

Veterans out there probably look at the chart below and wonder... what are these people talking about?! Every time I talk to someone new to the space they say, 'Data Analyst - that is like Artificial Intelligence, right?' The answer is 'not really' - machine learning and AI algorithms are only a small percentage of data analysis. You can't enable the top of the pyramid if the foundation and the lower rungs are not constructed properly. I will explain...

[Chart: the data hierarchy pyramid - a very complicated picture.]

In the Army, you need to organize how you bring data into the decision making process. If the section commander needs to make a decision, he needs to:

collect it - shots fired, 'what was that?', ask questions, demand 'send situation report'
move/store it - write it on the map, send it by radio
transform it - prepare a contact report and/or a medical evacuation request
aggregate it - make a decision, call for help
learn from it - conduct an After Action Review
optimize it - run the scenario in a simulation 1,000 times and see if there were better options (this is also done through repetitive, challenging and interesting training)

So what? If you are collecting the wrong or inaccurate information, not storing it properly, and preparing incomplete or faulty reports that lead to bad decisions and support zero learning - good luck optimizing. 'Bad data in - bad data out.' You need to properly collect, store, transform and aggregate to make good decisions, learn and optimize.

[Image: 'Computer Errors: What can you do?' - after pushing the optimize button on Artificial Intelligence using 'bad data in - bad data out'.]

There is a risk, as we all move toward the shiny AI and machine learning solutions, of thinking that they will solve our problems on their own. Small Wars Journal points out that "The real concern, though, is that military leaders may not comprehend significant risks associated with blindly using such tools." Getting a computer programmer to feed a bunch of data into a machine learning tool while asking 'what's a battle?' is not going to optimize anything, because they don't understand the problem. This is where you - the veteran - are critical to solving problems in government, business or the military.

Further, as Deloitte points out, there is a concern about data overload. Sending the section commander 10-figure grids by voice, the color of the sky and everyone's favorite sports team during a firefight is not going to help. Commanders want to know "How does any of this information help me understand . . . what [operational] decisions are needed? Most of this is just information without analysis." They will need your help. They need an analytics translator.

You can use the four step process below to understand the data. You define the question - 'Where does the enemy hit the section?' You transform the data by collecting grids, contact reports and interviews with the section to understand their perspective. You analyse what you have, and you communicate your findings. As of today, a computer can't do this; organisations need real world problem solvers to investigate and ask the right questions. That is where you come in. As Deloitte points out in the section "Anticipating enemy fire: Mission analytics for threat assessment," if you do data analytics properly and define the right questions, you can start to know where hostile actions will occur before they happen. That is what you would provide the company/government/military when trained. You are working out what information is important to the organization so it can make the right decisions - preferably before the firefight.
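Here is what those four steps might look like once the reports are digitised. This is a sketch with invented data, not a real reporting system:

```python
# Define: where does the enemy hit the section?
# Transform: contact reports reduced to a grid reference and casualties.
# Analyse: aggregate contacts and casualties by grid square.
# Communicate: hand the commander a ranked list of hotspots.
import pandas as pd

reports = pd.DataFrame({
    "grid": ["QT1234", "QT1234", "QT5678", "QT1234", "QT9012"],
    "casualties": [0, 1, 0, 2, 0],
})

hotspots = (reports.groupby("grid")
            .agg(contacts=("casualties", "size"),
                 total_casualties=("casualties", "sum"))
            .sort_values("contacts", ascending=False))

print(hotspots)  # the top grid square is where to look first
```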

WYWM Basic Excel and Analysis Skills - Common Errors - Rallypoint

What is your role in Data Analytics? It is asking the right questions, seeing what data is out there and then asking more questions... WYWM will give you the tools to help transform, analyse and communicate - but the tools are useless if you don't define the problem. If the section members are feeding the Section Commander 'bad data in', then he will be forced to give 'bad data out'.

Working in Kandahar, Afghanistan - helping everyone do data analysis...

Caleb