Artificial intelligence is the apex technology of the information era. In the latest in our Profiles in Innovation series, we examine how advances in machine learning and deep learning have combined with more powerful computing and an ever-expanding pool of data to bring AI within reach for companies across industries. The development of AI-as-a-service has the potential to open new markets and disrupt the playing field in cloud computing. We believe the ability to leverage AI will become a defining attribute of competitive advantage for companies in coming years and will usher in a resurgence in productivity.

Heath P. Terry, CFA, (212) 357-1849, Goldman, Sachs & Co.

...identify clusters of unusual behavior. Predictive: predict the likelihood of customer or employee churn based on web activity and other metadata; predict health issues based on wearable data.

What is General, Strong or True Artificial Intelligence?
General, Strong, or True Artificial Intelligence are terms used for machine intelligence that fully replicates human intelligence, including independent learning and decision making. While techniques like Whole Brain Emulation are being used to work towards the goal of General AI, the amount of compute power required is still considered far beyond current technologies, making General AI largely theoretical for the time being.

Key drivers of value creation
We believe profit pool creation (and destruction) related to the AI theme is best analyzed by first breaking AI down into four key inputs: talent, data, infrastructure, and silicon. These inputs also double as barriers to adoption.

Talent
AI (deep learning in particular) is hard.
7、Per our conversations with VCs and companies in the space, this has created a talent shortage and a competition for this talent among large internet and cloud computing vendors (Exhibit 5). AI talent is in high enough demand that “acquihires” are still a common means to acquire necessary talent. As
8、the technology and tooling matures, talent may become less of a bottleneck. However, we believe talent will migrate to interesting, differentiated data sets. Due to this, we believe large differentiated data sets are the most likely driver of growth and incremental profit dollars as we move into an
AI-centric world.

Exhibit 5: A Scarcity of AI Talent is Driving M&A

Deep learning model accuracy by training data set size (0 = least accurate, 100 = most accurate):

Training data size        5       10       20       50      100      200
Brain                   0.3     3.39    45.71    59.07    72.82    98.44
Neck                   21.3    30.63    79.97    99.34    99.74    99.33
Shoulder               2.98    21.39    69.64    86.57    95.53    92.94
Chest                 23.39    34.45    62.53    96.18    95.25    99.61
Abdomen                 0.1     3.23    35.4     65.83    91.01    95.18
Pelvis                    0     1.15    15.99    55.9     83.7     88.45
Average                8.01    17.37    51.54    77.15    89.68    95.67

Source: Department of Radiology at Massachusetts General Hospital and Harvard Medical School

Data
Most deep learning today is either supervised or semi-supervised, meaning all or some of the data utilized to train the model must be labeled by a human. Unsupervised machine learning is the current "holy grail" in AI, as raw, unlabeled data could be utilized to train models. Broad adoption of deep learning will likely be tied to growth in large data sets (which is happening due to mobile and IoT) and to advances in unsupervised machine learning. However, we believe large differentiated data sets (electronic health records, omics data, geological data, weather data, etc.) will likely be a core driver of profit pool creation over the next decade.
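To make the labeling distinction concrete, the minimal Python sketch below (assuming numpy and scikit-learn are available; the data, labels, and model choices are purely illustrative) fits a supervised classifier that needs a human-provided label for every training example, then groups the same raw, unlabeled records with an unsupervised clustering method:

```python
# Minimal sketch contrasting supervised and unsupervised learning.
# The data, labels, and model choices are illustrative only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))             # 200 records, 4 features (raw, unlabeled data)
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # labels a human would have to supply

# Supervised: every training example requires a label y_i.
clf = LogisticRegression().fit(X, y)
print("supervised training accuracy:", clf.score(X, y))

# Unsupervised: the same raw, unlabeled X is grouped without any labels.
clusters = KMeans(n_clusters=2, n_init=10).fit_predict(X)
print("unsupervised cluster sizes:", np.bincount(clusters))
```

Semi-supervised approaches sit in between, using a small labeled subset alongside a much larger pool of unlabeled data.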
The amount of information created worldwide is expected to increase at a CAGR of 36% through 2020, reaching 44 Zettabytes (44 billion GB), according to IDC. Increases in connected devices (consumer and industrial), machine-to-machine communication, and remote sensors are combining to create large data sets that can then be mined for insights and to train adaptive algorithms. Availability of data has also increased dramatically in the last decade, with census, labor, weather, and even genome data available for free online in large quantities. We are also seeing increased availability of satellite imagery, which requires a great deal of compute to fully analyze. The US Geological Survey's Landsat 7 and Landsat 8 satellites image the entire Earth every 8 days, and the USGS makes those images available for free, though even when compressed, the ultra-high-definition images are approximately 1GB each in file size. Other companies, like Orbital Insight, are aggregating image data and creating commercial solutions across multiple industries.
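For a rough sense of what that growth rate implies, the short Python sketch below simply compounds a data volume at a 36% CAGR; the look-back horizons and the implied starting volumes are illustrative back-of-the-envelope figures, not IDC estimates:

```python
# Compounding a data volume at a 36% CAGR; horizons and starting volumes are illustrative.
def project_volume(base_zb: float, cagr: float, years: int) -> float:
    """Volume after `years` of growth at a constant annual rate `cagr`."""
    return base_zb * (1.0 + cagr) ** years

# Working backwards from the cited 44 ZB in 2020: the implied volume N years
# earlier is 44 / 1.36**N.
for years_back in (3, 5, 7):
    implied_base = 44 / (1.36 ** years_back)
    print(f"{years_back} years before 2020: ~{implied_base:.1f} ZB, "
          f"compounding back to ~{project_volume(implied_base, 0.36, years_back):.0f} ZB")
```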
Infrastructure
Hardware and infrastructure software are necessary to make AI work. We believe infrastructure to support AI will rapidly become commoditized. This view is based on two observations: 1) cloud computing vendors are well positioned to extend their offerings into AI infrastructure, and 2) open source (TensorFlow, Caffe, Spark, etc.) has emerged as the primary driver of software innovation in AI. To spur adoption of AI, we believe large cloud vendors will continue to open source infrastructure capabilities, limiting the potential for profit pool creation.

Exhibit 7: Internet Giants (such as Google) are spurring interest in AI via open sourcing technologies (such as TensorFlow)
GitHub repositories most starred, 2015-2016 (chart compares stars and forks for Facebook React Native, a native app framework; Apple Swift, a programming language; and TensorFlow, a library for machine learning)
Source: GitHub

Silicon
The repurposing of GPUs for deep learning has been one of the key drivers of our current "AI Spring". Within the AI/ML ecosystem, there are two primary applications that determine the performance of a neural network, with each requiring a different resource setup. The first is the construction and use of a training algorithm. The training algorithm leverages a large (usually the larger, the better) data set to find correlations and build a model that can determine the probability of an output, given a new input. Training is very resource-intensive, and most modern training is done on GPU-powered systems. The use of models and algorithms once they have been trained is referred to as inference. Inference requires far less computing power, and typically combs through smaller, incremental data input sets.
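The resource asymmetry between the two phases can be seen even in a toy model. The minimal numpy sketch below (the linear model, data sizes, and learning rate are illustrative assumptions, not anything from this report) loops over a large data set many times to fit its parameters, while inference on a new input is a single inexpensive operation:

```python
# Toy illustration of the training vs. inference asymmetry.
# Model, data sizes, and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features = 100_000, 50
X = rng.normal(size=(n_samples, n_features))       # large training data set
true_w = rng.normal(size=n_features)
y = X @ true_w + 0.1 * rng.normal(size=n_samples)  # targets with a little noise

# Training: repeated full passes over the data set; this is the compute-heavy
# part that, for deep networks, is typically run on GPU-powered systems.
w = np.zeros(n_features)
learning_rate = 0.1
for epoch in range(100):
    grad = 2 * X.T @ (X @ w - y) / n_samples       # gradient of mean squared error
    w -= learning_rate * grad

# Inference: applying the trained model to one new input is a single cheap step.
x_new = rng.normal(size=n_features)
print("prediction for one new input:", round(float(x_new @ w), 3))
```

Deep networks widen this gap dramatically, which is why training is concentrated on GPU-powered systems while trained models can often be served on far lighter hardware.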
While some GPUs are optimized for inference (Nvidia's P4 series and M4 series, for example), given the single-purpose nature of inference, specialized silicon is being developed specifically for that application, referred to as FPGAs (Field Programmable Gate Arrays) and ASICs (Application Specific Integrated Circuits). This type of integrated circuit was originally developed for prototyping CPUs, but is increasingly being used for inference in artificial intelligence. Google's Tensor Processing Unit is an example of an ASIC purpose-built for AI and machine learning. Microsoft has been using FPGA chips for inference as well. Intel acquired FPGA manufacturer Altera in 2015 on the view that by 2020, a third of data centers could be leveraging FPGAs for specialized use cases. Xilinx, which pioneered commercially viable FPGAs in the 1980s, has pointed to the cloud and large data centers as a significant avenue of growth going forward, having announced a strategic customer relationship with Baidu. Data centers make up roughly 5% of Xilinx's revenue now.
Exhibit 8: Evolution of AI: 1950-Present
Source: Company data, Goldman Sachs Global Investment Research

Fueling the future of productivity
Labor productivity growth in the U.S. has come to a halt in recent years after modest growth in the past decade and significant growth in the mid-to-late 1990s. We believe that the proliferation of consumable machine learning and AI has the potential to dramatically shift the productivity paradigm across global industries, in a way similar to the broad-scale adoption of internet technologies in the 1990s.
Across industries, we see a roughly 0.5%-1.5% reduction in labor hours spurred by automation and efficiency gains brought to bear by AI/ML technologies, resulting in a +51-154bps impact on productivity growth by 2025. While we expect AI/ML to improve both the denominator and numerator of productivity over time, we believe the most significant early impacts will be on the automation of lower-wage tasks, driving similar levels of output growth with fewer labor hours. Our base case AI/ML-driven improvement of 97bps implies a 2025 productivity growth IT contribution of 1.61%, or 11bps higher than 1995-2004 (Exhibits 9, 10).

Exhibit 9: Productivity analysis
$ millions, assumes linear nominal GDP growth beyond 2019

                             2016E     2017E     2018E     2019E     2020E     2021E     2022E     2023E     2024E     2025E
Output
US Nominal GDP* ($bn)       18,552    19,300    20,045    20,757    21,470    22,183    22,895    23,608    24,321    25,034
  yoy growth (%)              2.9%      4.0%      3.9%      3.6%      3.4%      3.3%      3.2%      3.1%      3.0%      2.9%
Productivity
Labor productivity            69.0      70.4      71.8      73.1      74.3      75.4      76.5      77.6      78.6      79.7
  yoy growth (%)              0.9%      2.1%      2.0%      1.7%      1.6%      1.6%      1.5%      1.4%      1.3%      1.3%
Labor hours (mn)           268,958   273,992   279,026   284,060   289,094   294,128   299,162   304,196   309,230   314,264

ML/AI impact                    Low      Base      High
Labor hours reduction (mn)  (1,571)   (2,969)   (4,714)
Reduction                      0.5%        1%      1.5%
2025 Labor hours (mn)       312,693   311,295   309,550
2025 GDP ($bn)               25,034    25,034    25,034
Labor productivity             80.1      80.4      80.9
  yoy growth (%)               1.8%      2.2%      2.8%
Improvement (bps)                51        97       154

Source: OECD, US Bureau of Labor Statistics, Goldman Sachs Global Investment Research
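The 97bps base case can be sanity-checked directly from Exhibit 9 by recomputing labor productivity as nominal GDP divided by labor hours. The Python sketch below uses only the 2024E/2025E figures from the table; the variable names are purely illustrative:

```python
# Recomputing the Exhibit 9 base case: productivity = nominal GDP / labor hours.
# All inputs are taken from the table above; only the variable names are illustrative.
gdp_2024_bn, hours_2024_mn = 24_321, 309_230   # 2024E baseline
gdp_2025_bn, hours_2025_mn = 25_034, 314_264   # 2025E baseline (no AI/ML impact)
hours_2025_base_case_mn = 311_295              # 2025E labor hours after ~1% reduction

def productivity(gdp_bn: float, hours_mn: float) -> float:
    """Output per labor hour: GDP in $bn (scaled to $mn) divided by hours in mn."""
    return gdp_bn * 1_000 / hours_mn

baseline = productivity(gdp_2025_bn, hours_2025_mn) / productivity(gdp_2024_bn, hours_2024_mn) - 1
base_case = productivity(gdp_2025_bn, hours_2025_base_case_mn) / productivity(gdp_2024_bn, hours_2024_mn) - 1

print(f"2025 productivity growth, baseline:  {baseline:.2%}")    # ~1.3%
print(f"2025 productivity growth, base case: {base_case:.2%}")   # ~2.2%
print(f"implied improvement: ~{(base_case - baseline) * 10_000:.0f}bps")  # ~97bps
```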
Technology and productivity growth
The 1990s technology boom saw unusual amplification of each of the two primary components of productivity, capital deepening and multifactor productivity (MFP), and was strongly correlated with rising equity valuations.

Capital Deepening. GS economist Jan Hatzius has provided recent analysis on the anti-cyclical tendency of capital deepening (capital stock per labor hour), as labor hours historically tend to rise during expansionary periods without an equal surge in capital stock (see Jan's report "Productivity Paradox v2.0 Revisited", published 09/2/2016). In the 1990s, capital deepening increased markedly, highlighted by atypical capital investment increases that outpaced growth in the labor market.

Multifactor productivity (MFP). A March 2013 Federal Reserve study by
David Byrne et al. suggests that the simultaneous diffusion of technology into IT-producing and general operations processes contributed to creating a threefold spike in growth (output per labor hour) during the 1990s, with IT-producing sectors responsible for at most 49% of the average annual increase in annual productivity growth from the pre-boom period to the period between 1995 and 2004 (Exhibit 10).

Exhibit 10: Late 90s: IT-producing sectors contribute nearly half of productivity growth, but lose value and share in growth post-tech boom
Chart: IT contribution vs. other nonfarm business contribution to productivity growth (% growth) for 1974-1995, 1995-2004, and 2004-2012; a 1.56% total average growth rate is virtually equal to the 1995-2004 average IT contribution.
Source: Federal Reserve Board, Goldman Sachs Global Investment Research

Post-millennium stagnation. During the past decade, capital deepening growth related to IT applications (computer hardware, software, and telecom) has stagnated. IT capital, relative to broader market capital, has contributed less to overall growth in this component than average contributions during and even before the tech boom. Aggregate labor hours have been increasing, but the contribution of capital intensity to productivity has drastically underperformed versus the 1990s. The introduction of increasingly sophisticated, consumable machine learning and AI may be a catalyst in bringing capital intensity back to the forefront, in our view, significantly increasing the productivity of labor similar to the cycle we saw in the 1990s.
We're more optimistic on the MFP side of the equation. GS economists have highlighted (Productivity Paradox v2.0 Revisited, 9/2/2016) that upward biases on ICT prices and a growth in inputs towards unmonetized outputs (free online content, back-end processes, etc.) add to the understatement of real GDP and productivity growth. The evolution of internet giants like Facebook and Google highlights the idea that complex labor and capital inputs aren't necessarily converted into the traditional consumer product monetization captured in standard productivity metrics.

AI/ML-induced productivity could impact investment
We believe that one of the potential impacts of increasing productivity from AI/ML could be a shift in the way companies allocate capital. Since mid-2011, the growth in dividends and share repurchases has significantly exceeded capex growth, as reluctance among management teams to invest in capital projects remains post-recession.
Exhibit 11: Companies are hesitant to sacrifice dividends (clear shift in cash utilization strategy)
Exhibit 12: Cyclically adjusted P/E ratios in a sluggish recovery (valuations only just hitting pre-recession levels)
Source: Shiller

...processor speeds, more memory, better attributes in computer hardware, which led to large increases in the measured contribution of the technology sector. The technology sector was very central to the pickup in the productivity numbers from the 1990s lasting to the early and mid-2000s.
Terry: We've seen a lot of technology development over the last 10-15 years. Why hasn't there been a similar impact to productivity from technologies such as the iPhone, Facebook, and the development of cloud computing?

Hatzius: We don't have a full answer to it, but I do think an important part of the answer is that the statistical ability to measure improvement in quality, and the impact of new products, in the economic statistics is limited. It's relatively easy to measure nominal GDP; that's basically a matter of adding up receipts. There is room for measurement error, as there is in almost everything, but I don't have a real first-order concern that measurement is getting worse in terms of measuring nominal GDP. Converting nominal GDP numbers into real GDP numbers by deflating with a quality-adjusted overall price index is where I think things get very difficult. If you look, for example, at the way software sectors enter the official numbers, if you believe the official numbers, $1000 of outlay on software now buys you just as much real software as $1000 of outlay bought you in the 1990s. The
49、ere is in almost everything, but I don t have real first order concern that measurement is getting worse in terms of measuring nominal GDP. Converting nominal GDP numbers into real GDP numbers by deflating it with a quality adjusted overall price index is where I think things get very difficult. If you look, for example, at the way software sectors enter the official numbers, if you believe the official numbers, $1000 of outlay on software now buys you just as much real software as $1000 of outlay bought you in the 1990s. The