- Parallel Requests: If you have access to MATLAB's Parallel Computing Toolbox, split the request into smaller batches and use parallel processing (for example, a parfor loop) to download them concurrently; see the parfor sketch further below.
- Limit Fields: Review the fields being requested. If some fields are not strictly needed, remove them from the request to reduce the amount of data being transferred.
- Bloomberg API Limitations: The Bloomberg API itself might have limitations on data retrieval rates, which can affect the time taken to download historical data. Check if there are any such limitations and if they are configurable.
- Caching: If you frequently need to access the same historical data, cache it locally after the initial download so that subsequent runs only retrieve incremental updates; see the caching sketch further below.
- Data Compression: Check if the Bloomberg service allows for data compression options during transfer, which could reduce the amount of data being sent over the network.
- Optimize MATLAB Code: Review your MATLAB code for inefficiencies. For example, preallocate memory for large datasets, use efficient data types, and avoid unnecessary loops or computations within the data retrieval script; a short preallocation sketch is included below.
- Incremental Downloads: Instead of downloading the entire dataset in one go, break it into smaller time frames and incrementally append the results. This approach can sometimes be more efficient and allows for intermediate saving and checkpointing; see the chunked-download sketch further below.
- https://www.mathworks.com/help/datafeed/blp.history.html?searchHighlight=history&s_tid=srchtitle_support_results_1_history
- https://www.mathworks.com/help/parallel-computing/index.html?searchHighlight=parallel%20computing%20toolbox&s_tid=srchtitle_support_results_1_parallel%20computing%20toolbox
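A minimal parfor sketch of the batching idea, assuming both the Parallel Computing Toolbox and the Datafeed Toolbox are installed and that each worker is allowed to open its own blp connection to the local Bloomberg terminal. The tickers, field, dates, and batch size are placeholders, not a recommended configuration.

```matlab
% Split the security list into batches and download each batch on a worker.
secs      = {'IBM US Equity','MSFT US Equity','AAPL US Equity','GOOGL US Equity'};
fields    = {'LAST_PRICE'};
fromDate  = datetime(2015,1,1);
toDate    = datetime(2023,12,31);
batchSize = 2;

starts  = 1:batchSize:numel(secs);
batches = arrayfun(@(s) secs(s:min(s+batchSize-1, numel(secs))), ...
                   starts, 'UniformOutput', false);

results = cell(size(batches));             % preallocated output
parfor k = 1:numel(batches)
    c = blp;                               % one Bloomberg connection per worker
    results{k} = history(c, batches{k}, fields, fromDate, toDate);
    close(c);
end
```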
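A sketch of the local-caching idea: results are stored in a MAT-file and later runs only request rows newer than the last cached date. The file name, ticker, and start date are illustrative, and the sketch relies on history returning, for a single security, a numeric matrix whose first column holds serial date numbers.

```matlab
% Load the cache if it exists and only fetch the missing tail of the history.
cacheFile = 'ibm_history.mat';             % illustrative file name
sec       = 'IBM US Equity';
fields    = {'LAST_PRICE'};
c         = blp;

if isfile(cacheFile)
    cached   = load(cacheFile, 'data');
    lastDate = datetime(cached.data(end,1), 'ConvertFrom', 'datenum');
    % A real script would also guard against the cache already being current.
    newData  = history(c, sec, fields, lastDate + caldays(1), datetime('today'));
    data     = [cached.data; newData];     % append only the incremental rows
else
    data = history(c, sec, fields, datetime(2015,1,1), datetime('today'));
end

save(cacheFile, 'data');
close(c);
```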
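A small illustration of the preallocation point: the output container is sized once, up front, instead of being grown inside the loop. The tickers and fields are again placeholders.

```matlab
% Preallocate the output instead of growing it with data{end+1} in the loop.
secs   = {'IBM US Equity','MSFT US Equity','AAPL US Equity'};
fields = {'LAST_PRICE'};
c      = blp;

data = cell(numel(secs), 1);               % sized once, up front
for k = 1:numel(secs)
    data{k} = history(c, secs{k}, fields, datetime(2020,1,1), datetime('today'));
end
close(c);
```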
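A sketch of the chunked, checkpointed download: the request is split into yearly windows, each chunk is saved as it arrives, and the pieces are stitched together at the end. The year range and checkpoint file name are arbitrary choices for illustration.

```matlab
% Download one year at a time, checkpointing after each chunk.
sec    = 'IBM US Equity';
fields = {'LAST_PRICE'};
c      = blp;

years  = 2015:2023;
chunks = cell(numel(years), 1);            % one cell per yearly chunk
for k = 1:numel(years)
    chunks{k} = history(c, sec, fields, ...
                        datetime(years(k), 1, 1), datetime(years(k), 12, 31));
    save('history_checkpoint.mat', 'chunks');   % resume point if the run stops
end
data = vertcat(chunks{:});                 % full history, stitched together
close(c);
```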