- u is the size of the data (bytes)
- b is the network bandwidth (bps)
- f(u) is the total time taken to transfer u bytes of data
- theta1*u is the time taken to compress u bytes of data
- theta2*u/b is the time taken to transfer the compressed data (dividing by b, since b is a bandwidth; u*b would not have units of time)
- theta3 is a constant protocol overhead
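Putting the definitions above together, the cost model can be sketched as follows (reading the transfer term as u/b so the units work out; the numeric values below are illustrative, not from the question):

```python
def transfer_time(u, b, theta1, theta2, theta3):
    """Estimated total time to compress and send u bytes over a b-bps link.

    theta1 * u      : compression time (linear in input size)
    theta2 * u / b  : transfer time (compressed size over bandwidth)
    theta3          : fixed protocol overhead
    """
    return theta1 * u + theta2 * u / b + theta3

# Illustrative values: 1 MB payload, 10 Mbps link; theta2 absorbs the
# compression ratio and the bytes-to-bits conversion.
t = transfer_time(u=1_000_000, b=10_000_000, theta1=2e-8, theta2=4.0, theta3=0.05)
```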
I want to optimize the transfer by simultaneously minimizing time and maximizing the amount of uncompressed data sent. Simply minimizing f(u) would be easy, but in this particular case I also want to maximize the amount of data transferred. The idea is to come up with a machine learning algorithm that adapts to different network scenarios and fluctuating bandwidth, rather than having the user provide parameters at the start of the program. Of course, I would implement it with sane seed values and keep them within sane boundaries.

Solution space

Given the nature of this problem, which class of algorithms should I be looking into? Would gradient descent be the right approach? Is it possible to apply maximization and minimization to the same function over two different variables? Am I over-thinking this?
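One way to make the "adapt to fluctuating bandwidth" part concrete is to treat theta1..theta3 as unknowns and fit them online with stochastic gradient descent on observed transfers. This is a minimal sketch, not a definitive design: the model, the simulated environment, and all numeric values (u in MB, b in MB/s, the "true" thetas) are assumptions for illustration, and the transfer term is read as u/b.

```python
import random

def predict(u, b, theta):
    """Predicted transfer time under the question's linear model
    (transfer term taken as u/b so the units come out as seconds)."""
    t1, t2, t3 = theta
    return t1 * u + t2 * u / b + t3

def sgd_step(theta, u, b, observed_time, lr=0.002):
    """One gradient-descent step on the squared prediction error,
    updating (theta1, theta2, theta3) from a single observed transfer."""
    err = predict(u, b, theta) - observed_time
    # Gradient of 0.5*err**2 w.r.t. each theta: err times that theta's feature.
    grad = (err * u, err * u / b, err)
    return tuple(t - lr * g for t, g in zip(theta, grad))

# Simulated environment (illustrative): u in MB, b in MB/s, and a "true"
# parameter vector the learner does not know and must recover.
random.seed(0)
TRUE = (0.02, 1.0, 0.05)

theta = (0.0, 0.0, 0.0)  # sane seed values would go here
for _ in range(30000):
    u = random.uniform(1.0, 10.0)
    b = random.uniform(1.0, 20.0)
    theta = sgd_step(theta, u, b, predict(u, b, TRUE))
# theta is now close to TRUE, tracking the simulated network's behaviour.
```

With an up-to-date theta in hand, the two competing goals can be folded into one scalar objective, e.g. maximize u - lambda*f(u) for a chosen trade-off weight lambda, subject to bounds on u; scalarization like this is the standard way to turn a minimize-one-thing/maximize-another problem into a single optimization.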
Asked By : rajeshnair
Answered By : Jonathan Silva
Best Answer from StackOverflow
Question Source : http://cs.stackexchange.com/questions/52071