Thread 10

The paper is really interesting because it addresses an issue that is becoming more and more relevant: data processing. This task is crucial for so many different activities inside a company, whether that is handling logs, handling user data, or classifying data in general across an organization. It is an interesting problem in software engineering because the algorithms we use to process data are usually not hard to implement, or even hard to design; the real issue is that we start to run out of a crucial resource, which is memory. So the biggest problem we need to handle is making sure we don't exhaust the memory of a server, or even its computation power. This is where creating a parallel system, or a cluster of servers that can work in parallel on the data, can certainly help us.

When the focus was on algorithms, we thought the goal was to make the algorithm run faster and with fewer resources. Yet if we have an insane amount of data, the algorithm will fail at a certain point anyway. This is why parallel programming is becoming more and more important for these needs, and it is related to an article we saw previously, "The Free Lunch Is Over." We thought that as data started to scale, our hardware would scale with it. That is not the case; as a matter of fact, hardware has scaled by adding more cores and threads, forcing us to focus more on parallel and concurrent programming. Even with the perfect algorithm, having the correct concurrent architecture is essential for data processing.
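To make the idea a bit more concrete, here is a minimal sketch (not from the paper; the names count_errors, chunked, and LOG_LINES are my own illustrative choices) of splitting log processing across workers with Python's multiprocessing module, so each worker only handles one chunk of the data at a time:

```python
# Minimal sketch: counting error entries in a large log in parallel.
# Everything here is illustrative, not taken from the paper.
from multiprocessing import Pool

def count_errors(chunk):
    """Process one chunk independently; each worker only sees its own slice."""
    return sum(1 for line in chunk if "ERROR" in line)

def chunked(lines, size):
    """Split the data so no single worker has to walk the whole dataset."""
    for i in range(0, len(lines), size):
        yield lines[i:i + size]

if __name__ == "__main__":
    # Stand-in for a huge log; in practice this would be streamed from disk
    # or read directly by each node in a cluster.
    LOG_LINES = [f"INFO request {i}" if i % 7 else f"ERROR request {i}"
                 for i in range(1_000_000)]

    with Pool() as pool:
        partial_counts = pool.map(count_errors, chunked(LOG_LINES, 100_000))

    print("total errors:", sum(partial_counts))
```

The same map-and-combine pattern is what a cluster of servers does, only with machines instead of local processes, which is exactly the kind of concurrent architecture the paper argues we need once the data outgrows a single box.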
