```cpp
#include <numeric>  // std::reduce lives in <numeric>, not <algorithm>
#include <print>
#include <vector>

int main() {
    std::vector<double> data = {1.0, 2.0, 3.0, 4.0};
    // Default behaviour of reduce is to "+", however that is defined for your data type
    auto sum = std::reduce(data.begin(), data.end());
    std::print("Sum: {}\n", sum);
    return 0;
}
```
Pass std::execution::par as the first parameter to std::reduce - this is a directive that says "parallelise this if you can". The execution policies live in the <execution> header.
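A minimal sketch of that call, reusing the vector from the example above (whether the reduction actually runs in parallel is up to the implementation and its threading backend):

```cpp
#include <execution>
#include <numeric>
#include <print>
#include <vector>

int main() {
    std::vector<double> data = {1.0, 2.0, 3.0, 4.0};
    // std::execution::par permits, but does not guarantee, a parallel reduction
    auto sum = std::reduce(std::execution::par, data.begin(), data.end());
    std::print("Sum: {}\n", sum);
    return 0;
}
```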
This is safe for the default operation of reduce because + is associative, so the implementation is free to reorder and regroup the partial sums; for non-associative operations more care must be taken. HPX accepts the same style of execution policy and additionally lets you attach hints to it, such as the number of cores to use:

```cpp
// ...
hpx::execution::experimental::num_cores nc(2);
auto sum = hpx::reduce(hpx::execution::par.with(nc), data.begin(), data.end());
// ...
```
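Filled out into a compilable program, that fragment might look like the sketch below; the exact header names are my assumption about a typical HPX installation:

```cpp
#include <hpx/hpx_main.hpp>  // assumption: lets a plain main() run on the HPX runtime
#include <hpx/execution.hpp>
#include <hpx/numeric.hpp>

#include <print>
#include <vector>

int main() {
    std::vector<double> data = {1.0, 2.0, 3.0, 4.0};

    // Hint: restrict the parallel reduction to two cores
    hpx::execution::experimental::num_cores nc(2);
    auto sum = hpx::reduce(hpx::execution::par.with(nc), data.begin(), data.end());

    std::print("Sum: {}\n", sum);
    return 0;
}
```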
Sorting is O(n log n) and has a large constant coefficient, but it buys us cheaper lookups: unsorted data only allows an O(n) search (going down the list and checking each element), whereas sorting gives us O(log n) binary search, which is more efficient. For removing duplicates specifically, std::unique is a better solution - it removes consecutively equal elements from a range in place, returns an iterator to the new logical end, and runs in O(n).
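Because std::unique only collapses runs of equal neighbours, it is usually paired with a sort. A minimal sketch of that sort-unique-erase idiom (the sample values here are invented):

```cpp
#include <algorithm>
#include <print>
#include <vector>

int main() {
    std::vector<int> v = {3, 1, 2, 3, 2, 1};

    std::sort(v.begin(), v.end());                   // O(n log n): equal elements become adjacent
    auto new_end = std::unique(v.begin(), v.end());  // O(n): compacts unique elements to the front
    v.erase(new_end, v.end());                       // discard the leftover tail

    std::print("{}\n", v);  // prints [1, 2, 3] via C++23 range formatting
    return 0;
}
```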