It is pretty clever to suggest objective morality without specifying an actual moral code, as it is always the specifics that cause problems.
My issue is that Eliezer appears to suggest human morality and alien morality could be judged separately from the genetics of each species. Would superintelligent alien bees have the same notions of fairness as we do? Could we simply transplant our morality onto them and judge them accordingly, with no adjustments made for biological differences? I think it is very likely that such a species would consider the fairest distribution of a found pie to be one in which a sizeable portion goes to the queen, and that a worker who disagreed would be acting immorally. Is this something we can safely say is objectively wrong?
I’m not sure I understand at what point the torture would no longer be justified. It’s easy to say that torturing one person is preferable to a googolplex of people getting dust specks, but there has to be some number at which this is no longer the case. At some point even your preferences should flip, yet you never suggest where that point is. Would it be somewhere around 1.5–1.6 billion, assuming each dust speck is worth one second of pain? Is it acceptable if only 2 people are affected? How many dust specks go into 1 year of torture? I think people would be more comfortable with your conclusion if you had some way to quantify it; right now all we have is your assertion that the math comes out against the dust specks.
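To show where my 1.5–1.6 billion guess comes from: if we grant the (entirely made-up) assumption that one dust speck equals one second of torture-equivalent pain, the break-even point against the 50-year torture from the original thought experiment is simple arithmetic:

```python
# Back-of-the-envelope sketch. Assumptions (mine, not Eliezer's):
# one dust speck = one second of pain, and pain adds up linearly.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 31.6 million seconds

torture_years = 50  # the torture duration in the original post
break_even_specks = torture_years * SECONDS_PER_YEAR

print(f"{break_even_specks:,.0f}")  # 1,577,880,000 -- roughly 1.5-1.6 billion
```

Under those assumptions the crossover sits near 1.58 billion specks, astronomically short of a googolplex; of course the whole question is whether linear addition across people is the right model in the first place.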