Previous articles introduced and analyzed several of the header files that support multi-threading in the C++11 standard library: <thread>, <mutex>, and <condition_variable>. Now let's strike while the iron is hot: in this article, we will analyze the asynchronous operations of multi-threading.
The source code for multi-threaded asynchronous operations lives in the header file <future>. Let's first look at what classes and functions are defined in <future>.
classes      | future | future_error | packaged_task | promise | shared_future
enum classes | future_errc | future_status | launch
functions    | async | future_category
As you can see from the table above, <future> provides five classes, three enum classes, and two functions as external interfaces. In this article, we will first analyze several of the interfaces in the table above (std::async, std::future, future_status, and launch); the rest will be analyzed in the next blog post.
std::future
std::future provides a mechanism for accessing the result of an asynchronous operation. When the result of an asynchronous operation cannot be obtained immediately, it can be obtained by waiting for it synchronously, and the progress of the operation can be checked by querying the future_status of the future.
The state of a future is described by future_status, which is defined as follows:
enum class future_status {
    ready,    // The asynchronous operation has completed and the shared state has become ready
    timeout,  // The asynchronous operation timed out; the shared state did not become ready within the specified time
    deferred  // The asynchronous operation has not started yet; the shared state contains a deferred function
};
Several methods are provided in class future: get(), wait(), wait_for(), wait_until(). Before going through them, let's first introduce a function: std::async.
std::async
std::async returns a future object whose associated asynchronous state manages the execution of a callable. That may not mean much conceptually yet, so let's look at a piece of code.
#include <iostream>  // std::cout
#include <future>    // std::async, std::future

using namespace std;

// This is a function for testing prime numbers.
bool is_prime(int x) {
    cout << "Calculating. Please, wait...\n";
    for (int i = 2; i < x; ++i) {
        if (x % i == 0) {
            return false;
        }
    }
    return true;
}

int main() {
    // Asynchronously call the function is_prime(313222313)
    future<bool> fut = async(is_prime, 313222313);
    cout << "Checking whether 313222313 is prime.\n";
    bool ret = fut.get();  // Wait for is_prime to return
    if (ret) {
        cout << "It is prime!\n";
    } else {
        cout << "It is not prime.\n";
    }
    return 0;
}
As we can see from the code, async associates the is_prime function with the input argument 313222313 and returns a future object tied to the asynchronous shared state. Once is_prime has been associated, the main function and is_prime execute asynchronously (in fact, on two threads). The return value of is_prime can be obtained through the get() method of the future object (fut in the code). Note that fut.get() is called in main: if is_prime has already finished, its return value is obtained directly; if is_prime has not finished yet, the call blocks until the return value is available.
In fact, the first parameter of std::async can be used to set the launch policy for the associated function, as follows:
future<bool> fut = async(launch::async, is_prime, 313222313);    // 1
future<bool> fut = async(launch::deferred, is_prime, 313222313); // 2
future<bool> fut = async(launch::any, is_prime, 313222313);      // 3
future<bool> fut = async(launch::sync, is_prime, 313222313);     // 4

These four launch policies are defined as follows (I will talk about them in more detail later):

// Asynchronous launch policies
enum class launch {
    async = 0x1,             // Asynchronous launch: when std::async() is called, a new thread is created to run the function asynchronously, and a future object is returned.
    deferred = 0x2,          // Deferred launch: no thread is created when std::async() is called; the function does not run until get() or wait() is called on the returned future object.
    any = async | deferred,  // Automatic: the implementation chooses the policy, depending on the system and library, usually to make the best use of the concurrency currently available.
    sync = deferred
};

So much for the details; let's summarize std::async:
1. The asynchronous interface std::async can automatically create a thread to run the given function and returns a std::future object, from which the thread's result can easily be obtained.
2. It provides launch policies, so the start of a function can be deferred (a short sketch contrasting launch::async and launch::deferred follows below).
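To make the difference between the first two policies concrete, here is a minimal sketch (the heavy_work function and the value 42 are placeholders I made up, not part of the library): with launch::async the function starts on a new thread right away, while with launch::deferred it does not run until get() or wait() is called on the returned future.

#include <iostream>  // std::cout
#include <future>    // std::async, std::future, std::launch

using namespace std;

int heavy_work() {
    cout << "heavy_work is running\n";
    return 42;
}

int main() {
    // launch::async: heavy_work starts on a new thread immediately.
    future<int> f1 = async(launch::async, heavy_work);

    // launch::deferred: heavy_work has not run yet; it will run lazily,
    // on this thread, when get() (or wait()) is called below.
    future<int> f2 = async(launch::deferred, heavy_work);

    cout << "futures created\n";
    cout << "f1: " << f1.get() << "\n";  // waits for the async thread
    cout << "f2: " << f2.get() << "\n";  // runs heavy_work here, then returns
    return 0;
}

Note that in the launch::async case heavy_work runs on a separate thread, so its output may interleave with the output of main.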
Now let's come back to std::future. In fact, a future object can be obtained in three ways:
1. std::async
2. std::promise::get_future
3. std::packaged_task::get_future
We have already touched on method 1 when discussing std::async; methods 2 and 3 will be covered when we explain std::promise and std::packaged_task.
Next, we continue to analyze several methods of std::future objects: get(), wait(), wait_for(), wait_until().
std::future::get()
1. When the shared state is ready, it returns the value stored in the shared state (or throws the exception stored there).
2. When the shared state is not ready, the calling thread blocks until it becomes ready.
3. Once the shared state is ready, the function no longer blocks: it returns (or throws) the content of the shared state and then releases the shared state, so the future object is no longer valid. In other words, get() can be called at most once on the shared state of each future object (see the sketch below).
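Point 3 can be observed with future::valid(). Below is a minimal sketch (the lambda and the value 7 are arbitrary choices): the future owns a shared state before get() and has released it afterwards.

#include <iostream>  // std::cout
#include <future>    // std::async, std::future

using namespace std;

int main() {
    future<int> fut = async([] { return 7; });
    cout << "valid before get(): " << fut.valid() << "\n";  // 1: owns a shared state
    cout << "value: " << fut.get() << "\n";                 // 7: blocks if not ready yet
    cout << "valid after get():  " << fut.valid() << "\n";  // 0: shared state released
    // Calling fut.get() again would be an error: the future no longer has a
    // shared state (implementations typically throw std::future_error).
    return 0;
}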
std::future::wait()
1. Wait for the shared state to become ready. If it is not ready, the calling thread blocks until it becomes ready.
2. Once the shared state is ready, the function returns without blocking; unlike get(), it neither reads the stored value nor throws the stored exception.
Let's take an example:
#include <iostream>  // std::cout
#include <future>    // std::async, std::future
#include <chrono>    // std::chrono::milliseconds

using namespace std;

bool is_prime(int x) {
    for (int i = 2; i < x; ++i) {
        if (x % i == 0) {
            return false;
        }
    }
    return true;
}

int main() {
    future<bool> fut = async(is_prime, 194232491);
    cout << "checking...\n";
    fut.wait();  // Wait for the shared state to become ready
    cout << "\n194232491 ";
    if (fut.get())  // Because wait() ensured the shared state is ready, get() will not block.
        cout << "is prime.\n";
    else
        cout << "is not prime.\n";
    return 0;
}

std::future::wait_for()
1. Wait for the shared state to become ready, but only for a limited period of time;
2. If the shared state is not ready, the calling thread blocks until it becomes ready or the relative timeout rel_time has elapsed.
For instance:
#include <iostream>  // std::cout
#include <future>    // std::async, std::future
#include <chrono>    // std::chrono::milliseconds

using namespace std;

bool is_prime(int x) {
    for (int i = 2; i < x; ++i) {
        if (x % i == 0) {
            return false;
        }
    }
    return true;
}

int main() {
    future<bool> fut = async(is_prime, 700020007);
    cout << "checking, please wait";
    chrono::milliseconds span(100);
    while (fut.wait_for(span) == future_status::timeout) {  // As long as wait_for reports a timeout, keep waiting
        cout << '.';
    }
    bool x = fut.get();  // wait_for returned ready, so get() will not block.
    cout << "\n700020007 " << (x ? "is" : "is not") << " prime.\n";
    return 0;
}

std::future::wait_until()
1. Wait for the shared state to become ready, but only until the designated time point arrives.
2. If the shared state is not ready, the calling thread blocks until it becomes ready or the specified time point is reached (a small sketch follows below).
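Here is a minimal sketch of wait_until, reusing the is_prime function from the examples above (the 100 ms deadline is an arbitrary choice): it works like the wait_for example, but waits against an absolute time point instead of a relative duration.

#include <iostream>  // std::cout
#include <future>    // std::async, std::future, std::future_status
#include <chrono>    // std::chrono::system_clock, std::chrono::milliseconds

using namespace std;

bool is_prime(int x) {
    for (int i = 2; i < x; ++i)
        if (x % i == 0) return false;
    return true;
}

int main() {
    future<bool> fut = async(launch::async, is_prime, 700020007);

    // Wait until an absolute time point (here: 100 ms from now).
    chrono::system_clock::time_point deadline =
        chrono::system_clock::now() + chrono::milliseconds(100);

    while (fut.wait_until(deadline) == future_status::timeout) {
        cout << "not ready yet, extending the deadline\n";
        deadline = chrono::system_clock::now() + chrono::milliseconds(100);
    }
    cout << "700020007 " << (fut.get() ? "is" : "is not") << " prime.\n";
    return 0;
}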
OK, that's the use of std::future. In addition, note that future_status::timeout is used in the code above to indicate why wait_for returned; future_status is an enum class, defined as follows:
enum class future_status {
    ready,    // The shared state is ready
    timeout,  // The wait timed out
    deferred  // The shared state contains a deferred function (std::async was called with std::launch::deferred)
};
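To tie these values together, here is a minimal sketch (the slow_add helper, its 200 ms sleep, and the zero timeout are made up for illustration): calling wait_for with a zero timeout returns immediately with the current status, which makes it easy to see both timeout and deferred.

#include <iostream>  // std::cout
#include <future>    // std::async, std::future, std::future_status, std::launch
#include <chrono>    // std::chrono::milliseconds
#include <thread>    // std::this_thread::sleep_for

using namespace std;

int slow_add(int a, int b) {
    this_thread::sleep_for(chrono::milliseconds(200));  // simulate some work
    return a + b;
}

int main() {
    future<int> running  = async(launch::async, slow_add, 1, 2);
    future<int> deferred = async(launch::deferred, slow_add, 3, 4);

    // A zero timeout makes wait_for() return immediately with the current status.
    chrono::milliseconds zero(0);

    if (running.wait_for(zero) == future_status::timeout)
        cout << "running: not ready yet\n";      // slow_add is most likely still sleeping
    if (deferred.wait_for(zero) == future_status::deferred)
        cout << "deferred: has not started\n";   // runs only when get()/wait() is called

    cout << running.get() << ' ' << deferred.get() << '\n';  // prints: 3 7
    return 0;
}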