My todo process is based on agile principles (you will find them below), and its goal is to facilitate three things:
- to PLAN: I get an overview of all potential todos and break them down into 25-minute tasks
- to PRIORITIZE: I make sure that I not only do things right, but do the right things!
- to REVIEW: I check my progress weekly: “You can only manage what you can measure”
Tools you need are:
The todo workflow:
At the beginning of a project I start with the planning game:
- The “customer” is asked for the desired features. These features are the goals/user stories/hypotheses. The customer can be an external person or myself. The interview technique is important here: brainstorm and write the general hypotheses for the process steps down in Asana.
- New tasks are added with Tab-Q: I choose an action title, add a description such as a web URL, assign it to a person and select a project. Title: write down the desired result, with specific words and details … so that an assistant could do it.
- Go to my inbox (tasks assigned to me within 1 day) and prioritize the tasks according to the Eisenhower matrix.
- Important: What happens if I don’t do it? Does it damage my career (3), my personal relationships (2), or my health/self (1)? (Tab-H) Write down the goals for the next 1-2 years and what has to happen (people liking you, …). If a task contributes to a goal, then it is important.
- Select all and send to upcoming (Tab-L)
- Urgent: When is it due? Give it a date, otherwise check again in a month! (Tab-D) Date: always keep a 3-day buffer!
- The “engineer” estimates the difficulty of the task.
- By creating a dummy subtask for every 25-minute pomodoro.
- Decide on the goals/hypotheses for the “2-week sprint”.
- Identify the right question to prove the hypothesis, and the test that assesses whether the goal has been reached.
- Break the selected big goal down into small 25-minute tasks. This is important so that I am not overwhelmed and can also get work done in small breaks.
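The breakdown step above can be sketched in a few lines of Python. This is a minimal illustration, not part of any tool: the function name, the minute-based estimate, and the subtask naming scheme are all assumptions for the sketch.

```python
# Sketch: split an estimated goal into dummy 25-minute pomodoro subtasks.
import math

POMODORO_MIN = 25  # one pomodoro = 25 minutes

def break_down(goal: str, estimated_minutes: int) -> list:
    """Return one dummy subtask title per 25-minute pomodoro."""
    n = math.ceil(estimated_minutes / POMODORO_MIN)
    return [f"{goal} - pomodoro {i + 1}/{n}" for i in range(n)]
```

A 60-minute goal, for example, becomes three small tasks that fit into short breaks.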
Start working from Category 1 to 4.
- Short-term todos that impact my long-term goals. Do them now and with high accuracy. Try to prevent this next time by planning better, or by saying no / telling somebody that I don’t appreciate such short-notice work.
- Not urgent, but impacting my long-term goals. I plan specific times to do these todos with high accuracy.
- Not important to long-term goals, but urgent. I delegate them or do them quickly using the 80/20 rule.
- Not important and not urgent: Do I really have to do them?
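The whole prioritization step can be summarized as a small Python sketch. The importance weights (career = 3, relationships = 2, health = 1) and the 3-day buffer come from the workflow above; the `Task` fields and function names are illustrative assumptions, not any tool’s API.

```python
# Sketch: Eisenhower-matrix categorization as described in the workflow.
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

BUFFER_DAYS = 3  # "Date: always keep a 3-day buffer!"

@dataclass
class Task:
    title: str
    damages_career: bool = False         # weight 3
    damages_relationships: bool = False  # weight 2
    damages_health: bool = False         # weight 1
    due: Optional[date] = None

    def important(self) -> bool:
        # "What happens if I don't do it?" - any damage means important.
        return (self.damages_career or self.damages_relationships
                or self.damages_health)

    def urgent(self, today: date) -> bool:
        # No date yet means "check again in a month", i.e. not urgent now.
        if self.due is None:
            return False
        return self.due - timedelta(days=BUFFER_DAYS) <= today

def category(task: Task, today: date) -> int:
    """Eisenhower category 1-4; work from 1 to 4."""
    if task.important() and task.urgent(today):
        return 1  # do now, with high accuracy
    if task.important():
        return 2  # schedule a specific time
    if task.urgent(today):
        return 3  # delegate, or do quickly via the 80/20 rule
    return 4      # question whether to do it at all
```

A task that damages my career and is due within the buffer lands in category 1; one with no deadline and no damage lands in category 4.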
Do a 3-minute daily scrum every day after waking up:
- What did you do yesterday?
- What will you do today?
- Are there any impediments in your way?
- Collective Code Ownership:
- Find feedback groups for every project and demand active feedback.
- Put my “code” online so everyone in the group can track and improve it.
- Small Releases
- Integrate so that you always have a complete version at the end of the day.
- Simple Design
- Just a short structure for how to reach the goal. Work with hypotheses and try to prove them. Use this to break big goals down into small tasks.
- Customer on site:
- Always be in close contact with the customer. Check in at least once per week (either with a significantly improved version, or with a catalogue of the questions needed to overcome the challenges that prevented a significantly improved version last week).
- Don’t plan extensively for every eventuality. Focus on the next sprint. If something is wrong, change it.
- “Coding” standards:
- Define standards and make them available to everyone who works on the project, and inform yourself about existing standards in the project. Publish your standards on your blog in the category workflows.
- Test driven development:
- Before you start “implementing” / proving the hypothesis, define the test that shows when the hypothesis is proven or the goal is reached. E.g. what questions does a reader have to be able to answer after a chapter? Check this by letting somebody read the chapter and asking the questions.
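Applied to actual code, the test-first idea above looks like this minimal sketch. The hypothesis, the function name `write_summary`, and the checked keywords are all invented for illustration:

```python
# Sketch of test-driven development: the pass/fail check is written
# BEFORE the "implementation" exists.
def test_summary_answers_reader_questions():
    # Defined first: after reading the summary, the reader must be able
    # to answer what the goal is and how long a sprint lasts.
    summary = write_summary()
    assert "goal" in summary.lower()
    assert "sprint" in summary.lower()

def write_summary() -> str:
    # Written only after the test above was defined.
    return "Each 2-week sprint has one goal, proven by a predefined test."

test_summary_answers_reader_questions()
```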
- 40h week:
- Work a maximum of 40 hours per week. Track this.
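Tracking the cap can be as simple as this sketch; the day-to-hours log format is a hypothetical choice, not a prescribed tool:

```python
# Sketch: check worked hours against the 40-hour weekly cap.
WEEKLY_CAP_H = 40

def hours_left(week_log: dict) -> float:
    """Hours still available this week under the cap."""
    return WEEKLY_CAP_H - sum(week_log.values())

def over_cap(week_log: dict) -> bool:
    return hours_left(week_log) < 0
```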
- If you are not sure whether this is the right way, start with a prototype, experiment, and learn by “trial and error”. For example, instead of writing a complete chapter, do a quick research round and then write a short agenda/draft for the chapter.
Possibly a slight overstatement, but the following news story via the BBC offers a very interesting perspective on big data. The article includes some interesting and innovative use cases and encourages us to think big when we talk about the potential uses of big data.
The world’s population is projected to grow to 9 billion by 2050, and the Food and Agriculture Organization of the United Nations believes that food production will have to increase by 70% in the next 35 years to prevent widespread hunger. But the increasing use of farmland for biofuel production means that there is less land available for food, and about half – or two billion tonnes – of the food that is produced is wasted, according to the Institution of Mechanical Engineers.
The information stored on each e-Pill will be transmitted wirelessly to receivers as cows pass by, and then through the internet to Vital Herd’s cloud-based herd management software. This will collate and interpret the data about each animal so it can be viewed by farm managers.
Mr Walsh believes that more productivity benefits will be realised by analysing historical data from a wide range of cattle. “If we can aggregate data from customers in different regions we could do industry benchmarking and studies to link productivity to vital sign data and genetics,” he says.
For example, the Climate Corporation, a company founded by two ex-Google employees and acquired by agriculture giant Monsanto in 2013, operates a cloud-based farming information system that takes account of weather measurements from 2.5 million locations every day.
The system also uses daily weather data from the past few months to provide farmers with yield estimates for their crops in individual fields, and it allows them to explore historical data from the last thirty growing seasons to provide an accurate estimate of the value of fields they may be considering buying. But even if crops, dairy products and meat can be produced more efficiently by making use of big data, it’s a major undertaking to get it from the farm or abattoir to the dining room table.
Tech Mahindra, an IT service company based in Bangalore, India, offers a system called Farm-to-Fork which aims to monitor containers centrally, sending alerts out whenever the conditions in a container deviate from their ideal ones.
If automatic adjustment isn’t possible, humans can intervene. “For a ship on the high seas, an alert message goes to a technician to see what action can be taken,” Mr Vasudevanallur says. “With a truck, a driver can go to the nearest depot to get things fixed rather than driving on to his final destination.”
As part of its quest to develop cognitive systems, IBM Research is exploring whether a computer can be creative by designing a machine that can create surprising yet flavorful recipe ideas no cookbook has ever thought of in order to enhance human creativity.