BloombergGPT: How We Built a 50 Billion Parameter Financial Language Model
Tutorial #llm #bloomberggpt
This talk by David Rosenberg, Head of ML Strategy in the Office of the CTO at Bloomberg, covers #BloombergGPT, an experimental project by Bloomberg to create a ChatGPT-like large language model (#LLM) that serves both general-purpose and domain-specific needs.
BloombergGPT is a 50-billion-parameter LLM trained on 570 billion tokens of language data; roughly half of the data is public, and the other half is Bloomberg's private financial data.
One area where BloombergGPT outperformed its peers is translating natural language into Bloomberg Query Language (BQL). For example, given the input "Get me the last price and market cap for Apple", the expected output is get(px_last, cur_mkt_cap) for(['AAPL US Equity']). Without any prior knowledge of BQL, and using few-shot learning with just 3 example pairs of input and output, BloombergGPT could then produce the correct output for subsequent queries such as "market cap of AAPL vs. MSFT".