What are some well-known trading or research competitions in the quant world?

[Figure: recursive cherry-blossom tree drawn with turtle]

import turtle
import random
from turtle import *
from time import sleep

t = turtle.Turtle()
w = turtle.Screen()

def tree(branchLen, t):
    if branchLen > 3:
        if 8 <= branchLen <= 12:
            if random.randint(0, 2) == 0:
                t.color('snow')
            else:
                t.color('lightcoral')
            t.pensize(branchLen / 3)
        elif branchLen < 8:
            if random.randint(0, 1) == 0:
                t.color('snow')
            else:
                t.color('lightcoral')
            t.pensize(branchLen / 2)
        else:
            t.color('sienna')
            t.pensize(branchLen / 10)
        t.forward(branchLen)
        a = 1.5 * random.random()
        t.right(20 * a)
        b = 1.5 * random.random()
        tree(branchLen - 10 * b, t)
        t.left(40 * a)
        tree(branchLen - 10 * b, t)
        t.right(20 * a)
        t.up()
        t.backward(branchLen)
        t.down()

def petal(m, t):
    # petals under the tree
    for i in range(m):
        a = 200 - 400 * random.random()
        b = 10 - 20 * random.random()
        t.up()
        t.forward(b)
        t.left(90)
        t.forward(a)
        t.down()
        t.color('lightcoral')
        t.circle(1)
        t.up()
        t.backward(a)
        t.right(90)
        t.backward(b)

def main():
    t = turtle.Turtle()
    myWin = turtle.Screen()
    getscreen().tracer(5, 0)
    turtle.screensize(bg='wheat')
    t.left(90)
    t.up()
    t.backward(150)
    t.down()
    t.color('sienna')
    tree(60, t)
    petal(100, t)
    myWin.exitonclick()

main()
I worked on numerical methods for American option pricing during my PhD, so let me take a shot at this question. Corrections and discussion are welcome.

The Black-Scholes model was proposed by Black and Scholes in 1973 and involves the famous Black-Scholes partial differential equation. Mathematically it is a parabolic convection-diffusion equation: the variables are the underlying asset price (e.g. a stock price) and time, and the parameters, volatility and the interest rate, are both assumed constant. Given boundary conditions (the option price when the stock price is 0 and as it tends to infinity) and a terminal condition (the payoff at expiry), the European option price under the Black-Scholes model can be obtained as the closed-form solution of this PDE.

Several answerers argue about one point here. The Black-Scholes *equation* can indeed only be used to price European options, but American option pricing can still be based on the Black-Scholes *model*. Because of early exercise, American pricing is mathematically not the solution of a single PDE but of a harder variational inequality, or equivalently a complementarity problem, formulated with the Black-Scholes differential operator.

The binomial model, or binomial tree model, is one member of the family of tree models, which also includes the trinomial tree and the willow tree. The binomial model was proposed by Cox, Ross and Rubinstein in 1979 as a numerical method for pricing European or American options. In contrast to the differential-equation model, tree models work in discrete time and can be viewed as a discretization of the continuous model.

In fact every numerical method discretizes the continuous model, approximating the true solution in a finite-dimensional space — finite differences, finite volumes, finite elements alike. It was shown long ago (and is easy to see intuitively) that the binomial method is nothing but a special explicit finite-difference scheme.

Numerical methods for option pricing fall into two classes: deterministic methods, such as finite differences and (binomial) trees, and stochastic methods, i.e. Monte Carlo simulation. Their trade-offs:

For models with at most about five spatial dimensions — Black-Scholes, the Heston stochastic-volatility model, the Kou or Merton jump-diffusion models — the literature usually recommends deterministic methods. Their accuracy is high and can be estimated theoretically, and unlike stochastic simulation the results are fixed and do not change from run to run. The discretized European problem amounts to solving a linear system at each time step, the American one to a linear complementarity problem at each time step, and mature, efficient solvers exist for both. Finite differences have a further advantage: once the price surface is computed, the Greeks such as Delta and Gamma come almost for free.

As other answerers noted, when the model dimension is large, the size of the discretized problem for deterministic methods — finite differences in particular — grows exponentially with the number of nodes: the curse of dimensionality. For high-dimensional problems such as multi-asset options, nobody uses deterministic methods, either in theory or in practice. My reading of the literature is that beyond three spatial dimensions, researchers turn to dimension-splitting techniques such as alternating-direction methods, decomposing one multi-dimensional problem into several one-dimensional (Black-Scholes) problems.

This is why Monte Carlo remains the most widely used method in industry. For path-dependent exotic options and multi-asset models, Monte Carlo is direct and effective. In theory its efficiency and accuracy are poor, but the simulated paths are mutually independent, so the computation parallelizes well; raising the number of paths buys the required accuracy.

How slow is Monte Carlo? By the standard error estimate, the Monte Carlo error is of the same order as one over the square root of the number of trials. To be accurate to one decimal place you need about 100 trials; to two decimal places, 10,000; to three, 10^6; and so on. Many researchers therefore accelerate Monte Carlo with variance-reduction and control-variate techniques; see @袁浩瀚's answer to "Math question 3: propose at least three Variance Reduction methods for Monte Carlo Simulation and briefly describe how to implement them."

Besides Monte Carlo, mesh-free methods also avoid the blow-up of the discretization with dimension: unlike lattice methods, the size of the discrete problem depends only on the number of nodal basis functions, not on the PDE dimension. In my view this is a new direction for numerical option-pricing research; since it is off-topic here, I won't expand on it.
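The one-over-root-N behavior is easy to see numerically. Below is a minimal Python 3 sketch — made-up parameters, not production pricing code — comparing a plain Monte Carlo estimate of a European call with the Black-Scholes closed form:

```python
import math
import random

def bs_call(S, K, T, r, sigma):
    # Closed-form Black-Scholes price of a European call.
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return S * Phi(d1) - K * math.exp(-r * T) * Phi(d2)

def mc_call(S, K, T, r, sigma, n_paths, seed=42):
    # Plain Monte Carlo: simulate terminal prices under the risk-neutral
    # dynamics and average the discounted payoffs.
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * T
    vol = sigma * math.sqrt(T)
    total = 0.0
    for _ in range(n_paths):
        ST = S * math.exp(drift + vol * rng.gauss(0.0, 1.0))
        total += max(ST - K, 0.0)
    return math.exp(-r * T) * total / n_paths

exact = bs_call(100.0, 100.0, 1.0, 0.05, 0.2)
for n in (100, 10000, 1000000):
    err = abs(mc_call(100.0, 100.0, 1.0, 0.05, 0.2, n) - exact)
    print(n, round(err, 4))  # the error shrinks roughly like 1/sqrt(n)
```

With a hundredfold increase in paths the error drops by roughly a factor of ten, exactly the "one more decimal place costs 100x the work" trade-off described above.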
Thanks for the invite. I've been too busy these days to write a technical answer, so let me be lazy and paste an introductory crawler post I wrote earlier. All the pasted code ran at the time; whether it counts as useful material I'll leave to you.
----------------------------
Author: 洪宸
Link: Python 爬虫进阶? - 知乎用户的回答
Source: 知乎

A crawler is an effective way to get data, hack-style, when there is no (usable) API; "advancing" means moving from scraping simple pages to complex ones. Depending on the type of target site, different Python libraries can be combined to grab the data quickly. But whatever library you use, the first step is always to analyze the target page's elements and find the scraping pattern: some crawlers loop over a fixed URL prefix with varying suffixes, others take one start URL as a seed and recursively discover more target URLs; some pages serve static data directly, others render data with JS and require constructing follow-up requests... Writing it all down would take more than one article, so here are a few typical examples:

1. Pages whose URL is a fixed prefix plus a varying suffix:

Take scraping book-category information from the OPENISBN site. I have a batch of books with incomplete metadata — the category is missing — so I look it up on http://openisbn.com/ by ISBN. For《失控》(Out of Control), the page URL is the fixed prefix http://openisbn.com/isbn/ followed by the book's ISBN. Then inspect the page elements (Chrome: right-click → Inspect):

[Figure: Chrome DevTools view of the OPENISBN book page]

First, urllib2 + re to pull out the "Category:" line:

#-*- coding:UTF-8 -*-
import re
import urllib2

isbn = ''
url = 'http://openisbn.com/isbn/{0}/'.format(isbn)
category_pattern = re.compile(r'Category: *.*, ')
html = urllib2.urlopen(url).read()
category_info = category_pattern.findall(html)
if len(category_info) > 0:
    print category_info[0]
else:
    print 'get category failed.'

Output:

Category: 现当代小说, 小说,

2. Choosing a good anchor element:

Since the page contains only one "Category:" line, a regular expression is enough to extract it; if there were many such lines we would first have to narrow the search range, and BeautifulSoup's element lookup is what narrows it. Inspection shows that the innermost div containing the wanted "Category:" is <div class="PostContent"> — but there is another <div class="PostContent"> outside it, and the same goes for <div class="Post">. Using those as anchors, find() returns the first (outer) tag, whose range is not small enough, while findAll() returns a list of tags that still has to be traversed. So <div class="Article"> is the best anchor: find() on it returns a small enough range and needs no traversal.

urllib2 + Beautiful Soup 3 + re once more (the latest Beautiful Soup is the 4.x series, which supports both Python 2 and 3; BS4 is imported slightly differently from BS3):

#-*- coding:UTF-8 -*-
import re
import urllib2
from BeautifulSoup import BeautifulSoup

isbn = ''
url = 'http://openisbn.com/isbn/{0}/'.format(isbn)
category_pattern = re.compile(r'Category: *.*, ')
html = urllib2.urlopen(url).read()
soup = BeautifulSoup(html)
div_tag = soup.find('div', {'class': 'Article'})
category_info = category_pattern.findall(str(div_tag))
if len(category_info) > 0:
    print category_info[0]
else:
    print 'get category failed.'

Output:

Category: 现当代小说, 小说,

3. Scraping JS-rendered content:

Search Baidu for 日历 (calendar) and pull the public-holiday dates out of the result page.

[Figure: Baidu search result page with the calendar widget]

Opening the page with urllib as before, the returned HTML does not contain the expected content. The reason is that what the browser shows is the final page after JS rendering, which involves several intermediate AJAX requests, so a single urllib fetch cannot do the job. This is where selenium comes in (the webdriver used is PhantomJS; download phantomjs in advance and put it on your PATH). Since the page contains several <div class="op-calendar-new-relative"> markers, this time we use findAll, traverse the returned tag list, and parse each tag's a element; if it contains the 休 (day-off) marker, that day is a holiday.

# -*- coding:UTF-8 -*-
import re
import urllib
from selenium import webdriver
from BeautifulSoup import BeautifulSoup

holiday_list = []
url = 'http://www.baidu.com/s?' + urllib.urlencode({'wd': '日历'})
date_pattern = re.compile(r'date="[\d]+[-][\d]+[-][\d]+"')
driver = webdriver.PhantomJS()
driver.get(url)
html = driver.page_source
driver.quit()
soup = BeautifulSoup(html)
td_div_list = soup.findAll('div', {'class': 'op-calendar-new-relative'})
for td_tag in td_div_list:
    href_tag = str(td_tag.a)
    if href_tag.find('休') != -1:
        holiday_date_list = date_pattern.findall(href_tag)
        if len(holiday_date_list) > 0:
            holiday_list.append(holiday_date_list[0].split('"')[1])
print holiday_list

Output:

['', '', '', '', '']

4. Setting a proxy to scrape the Google Play charts (selenium not only simulates browser behavior well, it can also save a screenshot of the rendered page):

# -*- coding:UTF-8 -*-
from selenium import webdriver

url = 'https://play.google.com/store/apps/top?hl=zh_CN'
proxy_setting = ['--proxy=127.0.0.1:10800', '--proxy-type=socks5']
driver = webdriver.PhantomJS(service_args=proxy_setting)
driver.get(url)
driver.maximize_window()
# driver.implicitly_wait(10)
top_group_list = driver.find_elements_by_css_selector('.id-cluster-container.cluster-container.cards-transition-enabled')
driver.get_screenshot_as_file('top.jpg')
for top_group in top_group_list:
    group_name = top_group.find_element_by_xpath('div/div[@class="cluster-heading"]/h2/a').text
    for item in top_group.find_elements_by_class_name('title'):
        print u'bound: {0} app: {1}'.format(group_name, item.text)
driver.quit()

5. Crawling a whole site:

Scrape the phone models and prices under Yihaodian's mobile-phone category. From the product categories we get the start URL (http://list.yhd.com/c23586-0/?tp=…), collect every product's URL from the listing pages, and then scrape each item in the category. A site-wide crawl means many pages and concurrency, which is where the Scrapy framework comes in.

The category has 50 pages in total; with the page number appended, the URL format looks like this:

[Figure: listing URL with the page number appended after #]

Because the page number is appended with #, Scrapy considers all the pages to be the same URL and processes it only once.

Visiting the URL in a browser shows that the page is redirected, and the real URL format is this:

[Figure: the redirected listing URL format]

We can use the redirected URL as the seed URL for the crawl. The Scrapy code:

# -*- coding:UTF-8 -*-
import re
import time
from scrapy import Spider, Request
# from selenium import webdriver
# from selenium.webdriver.common.action_chains import ActionChains

class YhdMobileSpider(Spider):
    name = 'yhd_mobile'
    start_urls = ['http://list.yhd.com/c36/b/a-s1-v4-p1-price-d0-f0d-m1-rt0-pid-mid0-k/']

    def parse(self, response):
        '''
        @param response:
        @return: item list
        '''
        page_number = self.get_page_count(response)
        page_url_list = [re.sub(r'-p[\d]+-', '-p{0}-'.format(page), response.url) for page in xrange(1, page_number + 1)]
        return map(lambda url: Request(url, callback=self.parse_product_page), page_url_list)

    def parse_product_page(self, response):
        product_url_list = []
        for product_address in response.xpath('//div[@id="itemSearchList"]/div/div[@class="itemBox"]/p[@class="proName clearfix"]/a[1]/@href'):
            href = product_address.extract()
            product_url_list.append(href)
        item_list = map(lambda url: Request(url, callback=self.parse_item), product_url_list)
        return item_list

    def parse_item(self, response):
        '''
        Scrape the attributes of a single product from its page.
        @param response:
        @return: item
        '''
        # product detail URL
        url = response.url
        # brand
        brand = response.xpath('//div[@class="crumb clearfix"]/a[@id="brand_relevance"]/text()').extract()[0]
        # product name
        spu_name = response.xpath('//div[@class="crumb clearfix"]/span/text()').extract()[0]
        print url, brand, spu_name

    def get_page_count(self, response):
        page_count = response.xpath('//input[@id="pageCountPage"]/@value').extract()
        if page_count:
            page_count = int(page_count[0])
        else:
            page_count = 1
        return page_count

These examples are all simplifications of real needs from my past work; with suitable variation and extension you can move on to much more complex crawling. How to store the scraped data, and what to do with it afterwards, is another topic entirely.

(Zhihu refuses to display the original URLs properly no matter how many times I edit — oh well, bear with the links.)

——————————————分割线——————————————

I just came across a column with fairly complete resources; newcomers who want to learn Python can follow it: 如何学习Python爬虫[入门篇]? - 学习编程 - 知乎专栏
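The "narrow the range, then regex" idea from example 2 doesn't strictly require BeautifulSoup. As a rough Python 3 sketch using only the standard library — the HTML snippet below is made up to mimic the OPENISBN layout, not fetched from the site — the same anchoring can be done with html.parser:

```python
import re
from html.parser import HTMLParser

# Made-up snippet mimicking the OPENISBN layout: the wanted "Category:" line
# lives inside <div class="Article">, while a decoy sits elsewhere on the page.
SAMPLE = '''<html><body>
<div class="Post"><div class="PostContent">decoy Category: noise, </div></div>
<div class="Article"><div class="PostContent">
Title: Out of Control<br>Category: Fiction, Science, </div></div>
</body></html>'''

class DivTextCollector(HTMLParser):
    # Collect the text inside the first <div> whose class attribute matches.
    def __init__(self, target_class):
        super().__init__()
        self.target = target_class
        self.depth = 0   # div-nesting depth while inside the target div
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag != 'div':
            return
        if self.depth:
            self.depth += 1
        elif not self.text and dict(attrs).get('class') == self.target:
            self.depth = 1

    def handle_endtag(self, tag):
        if tag == 'div' and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth:
            self.text.append(data)

collector = DivTextCollector('Article')
collector.feed(SAMPLE)
scope = ''.join(collector.text)                 # the narrowed search range
match = re.search(r'Category: *(.*, )', scope)  # then the regex, as before
print(match.group(1) if match else 'get category failed.')
```

The decoy "Category: noise" outside the Article div is ignored because the regex only runs over the collected scope — the same reason the original example anchored on <div class="Article"> first.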
These days some financial startups with large in-house financial databases expose APIs for quant researchers — JoinQuant, 优矿 (Uqer), and so on. Below I demonstrate how to fetch Ping An Bank's historical trading data since 2015, using uqer.io + Python.

First log in to the site and open the research data documentation to see what data is available: https://uqer.io/data/browse/0/?page=1

[Figure: Uqer research-data browser]

You asked for historical stock data; I'll assume you mean historical prices — other data is fetched the same way.

Open the API documentation for historical prices:

[Figure: the MktEqudGet API documentation]

Per the docs, the main fields to fill in are the stock ticker and the date range, so we adapt the API call as follows:

DataAPI.MktEqudGet(tradeDate=u"", secID=u"", ticker=u"000001", beginDate=u"", endDate=u"", isOpen="", field=u"", pandas="1")

Create a new Notebook, enter the adapted API call, and Ctrl+Enter shows the data.

[Figure: query result in a Uqer notebook]

If you want to download the data, use code of this shape:

df = DataAPI......(the API call)
df.to_csv("price.csv", encoding="gbk")

This stores the query result in df, and df.to_csv("price.csv", encoding="gbk") writes df to the file price.csv using the Chinese GBK encoding.

[Figure: saving the result to price.csv]

Click Data in the sidebar and you will see the file.

[Figures: the Data sidebar and the exported file]

Finally, a plug: learning Python pays off in many ways. When I learned Python I wrote quantitative strategies on Uqer, working alongside 廖雪峰's Chinese Python tutorial and the official Python documentation, and built indicators myself — this greatly improves your data-handling skills in Python. You can also search for financial-engineering research reports and replicate the indicators in them; after working through six or seven such reports, your data-processing ability improves noticeably.

[Figures: strategy notebooks on Uqer]

From personal experience: if you study finance, learn as much computer science and math as you can, practice a lot, and think a lot. Wishing you a quick path to Python and easy access to all kinds of data!

----------------------------------------------------Update-----------------------------------------------------

I'd also like to introduce the tushare financial data package; see the official site http://tushare.org/index.html for details. It is called from Python.

With Python installed, run pip install tushare at the command line and the install is done. Then open jupyter/pycharm/sublime..., import tushare, and call the data APIs as described on the official site; here I also save the result to a new data.csv file:

import pandas as pd
import tushare as ts
df = ts.get_hist_data('600848')  # fetch the full daily K-line history in one call
df.to_csv('data.csv')            # export the data
df                               # view the data

(tushare provides not only historical prices but also news, macro data, industry classifications, index futures, funds, and even Xueqiu social statistics.)

The premium big data is provided by DataYes (通联数据); see their site for how those APIs are called.
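The encoding="gbk" argument above matters when the file contains Chinese text and will be opened by Excel on a Chinese-locale Windows machine. The round trip can be sketched with only the standard library (a hypothetical two-row table, not real quote data):

```python
import csv

# Hypothetical rows; real data would come from the DataAPI/tushare calls above.
rows = [['ticker', 'secShortName', 'closePrice'],
        ['000001', '平安银行', '10.50']]

# Write the CSV in GBK so Chinese-locale Excel opens it without mojibake.
with open('price.csv', 'w', encoding='gbk', newline='') as f:
    csv.writer(f).writerows(rows)

# Read it back with the same encoding to confirm the round trip.
with open('price.csv', encoding='gbk') as f:
    content = f.read()
print(content)
```

Opening the same file as UTF-8 would garble the Chinese column, which is exactly the symptom encoding="gbk" avoids.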
For algo trading there is the UChicago Midwest Trading Competition; a dozen or so schools take part each year, with MIT / Northwestern / UChicago / CMU / my school among the regulars...

The format: three cases. You write your strategy code in advance, and on competition day everyone flies to Chicago to drink, chat, and watch their programs run.

The UChicago Midwest Trading Competition: https://careeradvancement.uchicago.edu/uchicago-midwest-trading-competition

The sponsors are mainly quant and HFT shops — IMC, Optiver, DRW and the like. If you want to get into these firms, which recruit rather quietly, it's a good networking opportunity.