Concurrently schedule a task for every command-line argument:
```python
import sys
from concurrent.futures import ThreadPoolExecutor, as_completed

def main():
    # Skip sys.argv[0] (the script name); schedule one task per command-line argument.
    args = sys.argv[1:]
    with ThreadPoolExecutor(max_workers=max(len(args), 1)) as executor:
        tasks = {}
        for arg in args:
            future = executor.submit(MY_TASK_FUNCTION, arg)
            tasks[future] = arg
        # Yield futures as they finish, not in submission order.
        for future in as_completed(tasks):
            result = future.result()
            arg = tasks[future]
            print(f"Arg: {arg}\nResult: {result}\n--")

if __name__ == "__main__":
    import time

    global_start_time = time.time()
    main()
    global_elapsed_time = time.time() - global_start_time
    print(f"Total time taken for all tasks: {global_elapsed_time:.4f} seconds")
```
What would a fully parallel version using multiprocessing look like? Pretty much the same, just with `ProcessPoolExecutor` in place of `ThreadPoolExecutor`.
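A minimal sketch of that swap follows; `my_task_function` here is a hypothetical stand-in for `MY_TASK_FUNCTION`. Note that with processes the callable must be defined at module level so it can be pickled and sent to the worker processes:
```python
import sys
from concurrent.futures import ProcessPoolExecutor, as_completed

def my_task_function(arg):
    # Stand-in for MY_TASK_FUNCTION; must be a module-level, picklable callable.
    return arg.upper()

def main():
    args = sys.argv[1:]
    with ProcessPoolExecutor(max_workers=max(len(args), 1)) as executor:
        # Each task now runs in its own process instead of a thread.
        tasks = {executor.submit(my_task_function, arg): arg for arg in args}
        for future in as_completed(tasks):
            print(f"Arg: {tasks[future]}\nResult: {future.result()}\n--")

if __name__ == "__main__":
    main()
```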
If the function to submit to the executor needs more arguments than just the work item, wrap it with a lambda or `functools.partial` (prefer `functools.partial` when targeting `ProcessPoolExecutor`, since lambdas cannot be pickled).
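For instance, a sketch using a hypothetical `download(url, timeout, retries)` function; `functools.partial` pre-binds the configuration arguments so the executor only ever sees a one-argument callable:
```python
from functools import partial
from concurrent.futures import ThreadPoolExecutor

def download(url, timeout, retries):
    # Hypothetical task function that needs extra configuration arguments.
    return f"{url} (timeout={timeout}, retries={retries})"

with ThreadPoolExecutor(max_workers=4) as executor:
    # Pre-bind the extra arguments; the remaining one-argument callable
    # also works directly with executor.map().
    task = partial(download, timeout=10, retries=3)
    future = executor.submit(task, "https://example.com")
    print(future.result())
```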
Here is a simplistic blocking `MY_TASK_FUNCTION`, a `fetch_title(url)` routine that fetches page titles from URLs:
```python
import re
from urllib.request import Request, urlopen
# Case-insensitive; DOTALL lets the title span multiple lines.
title_pattern = re.compile(r"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)
user_agent = (
    "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/116.0"
)

def fetch_title(url):
    # Send a browser-like User-Agent; some sites reject the default urllib one.
    headers = {"User-Agent": user_agent}
    with urlopen(Request(url, headers=headers)) as response:
        html_content = response.read().decode("utf-8", errors="replace")
    match = title_pattern.search(html_content)
    title = match.group(1).strip() if match else "Unknown"
    return title
```
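To wire the two pieces together, one option (the file name `fetch_titles.py` is just an assumption) is to keep both snippets in one file, replace `MY_TASK_FUNCTION` with `fetch_title`, and run something like `python fetch_titles.py https://www.python.org https://docs.python.org`; each URL then gets its own thread, and the titles print as the requests complete.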