**It can paint! It can paint! It can paint!** Yes, fun things deserve to be said three times.

Here is the story: using a Python deep learning package, we train the computer to imitate the style of a world-famous painting and then apply that style to another picture. Enough talk, straight to the pictures!

[Image: Picasso's self-portrait]

This is the famous painting *Picasso's self-portrait* (I don't really know what counts as a famous painting either, but I can Google, haha). Using this image as the template, the computer learns its style (for how the learning works, see this paper by the foreign experts: arxiv.org/abs/…) and applies it to my own photo.

[Image: the photo to be transformed]

And the result turns out like this:

[Image: the stylized result]

Whoa, that scared me! But for fun things you have to lead the charge yourself.

With the start of term getting closer, in order to give the incoming freshmen a nice campus... pff! In order to beautify the campus in the freshmen's minds (your senior really didn't mean to deceive you), I made the following "East China University of Technology as painted by Van Gogh". Never heard of this university? Indeed, it's just an ordinary second-tier school, but that's not the point.
In each panel, the left image is Van Gogh's *The Starry Night* used as the template, the middle image is the photo to be transformed, and the right image is the result.

[Image: the inner "lake" of our campus (really a pond)]

[Image: the cherry blossom square (personally I think it's the most romantic spot on campus)]

[Image: the campus library, no comment needed]

[Image: the willows beside the "pond"]

[Image: the east gate of the campus]

[Image: the surveying and mapping building]

[Image: the geosciences building]

For easier viewing, here are the generated images at full size:

[Seven full-size generated images]

Don't be fooled by the fact that it's only seven pictures: they kept the computer running for a very long time, and it froze twice along the way!

OK, the advertising is over; now for the goodies.

## Setting up a style transfer platform locally with Keras

## 1. Installing the dependencies

```bash
# install keras, h5py and tensorflow from the command line
pip3 install keras
pip3 install h5py
pip3 install tensorflow
```

If installing tensorflow from the command line fails, you can download a whl package here instead: Python Extension Packages for Windows (http://www.lfd.uci.edu/~gohlke/pythonlibs/); once the page is open, press Ctrl+F and search for "tensorflow" to find it quickly.

## 2. Setting up the runtime environment

**Download the VGG16 model** (share link: https://…/s/1i5wYN1z) and place it in the directory shown below:

[Image: directory layout showing where the downloaded VGG16 weights file goes]

## 3. The code
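The listing below is the Keras neural style transfer example script. As a quick orientation (my own summary of what the code computes, not text from the original post): the script treats the pixels of the generated image as the variables of an optimization problem and minimizes a weighted sum of a content term, a style term based on Gram matrices, and a total variation term:

$$
\mathcal{L}(x) \;=\; w_c \sum_{i,j}\bigl(F^{x}_{ij} - F^{p}_{ij}\bigr)^{2}
\;+\; \frac{w_s}{|L|}\sum_{l \in L}\frac{1}{4\,C^{2}S^{2}}\sum_{i,j}\bigl(G^{x,l}_{ij} - G^{a,l}_{ij}\bigr)^{2}
\;+\; w_{tv}\sum\bigl(\Delta_h^{2} + \Delta_v^{2}\bigr)^{1.25}
$$

Here $F^{x}$ are the VGG16 feature maps of the generated image $x$ at the content layer (block4_conv2) and $F^{p}$ those of the content photo; $G^{\cdot,l}_{ij} = \sum_k F_{ik}F_{jk}$ is the Gram matrix of the flattened feature maps at style layer $l$ (the five blockN_conv1 layers); $C$ and $S$ are the channel and pixel counts used for normalization; and $\Delta_h, \Delta_v$ are horizontal and vertical neighbor differences of $x$. The weights $w_c$, $w_s$, $w_{tv}$ correspond to the `--content_weight`, `--style_weight` and `--tv_weight` options below.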
```python
from __future__ import print_function
from keras.preprocessing.image import load_img, img_to_array
from scipy.misc import imsave
import numpy as np
from scipy.optimize import fmin_l_bfgs_b
import time
import argparse

from keras.applications import vgg16
from keras import backend as K

parser = argparse.ArgumentParser(description='Neural style transfer with Keras.')
parser.add_argument('base_image_path', metavar='base', type=str,
                    help='Path to the image to transform.')
parser.add_argument('style_reference_image_path', metavar='ref', type=str,
                    help='Path to the style reference image.')
parser.add_argument('result_prefix', metavar='res_prefix', type=str,
                    help='Prefix for the saved results.')
parser.add_argument('--iter', type=int, default=10, required=False,
                    help='Number of iterations to run.')
parser.add_argument('--content_weight', type=float, default=0.025, required=False,
                    help='Content weight.')
parser.add_argument('--style_weight', type=float, default=1.0, required=False,
                    help='Style weight.')
parser.add_argument('--tv_weight', type=float, default=1.0, required=False,
                    help='Total Variation weight.')

args = parser.parse_args()
base_image_path = args.base_image_path
style_reference_image_path = args.style_reference_image_path
result_prefix = args.result_prefix
iterations = args.iter

# these are the weights of the different loss components
total_variation_weight = args.tv_weight
style_weight = args.style_weight
content_weight = args.content_weight

# dimensions of the generated picture.
width, height = load_img(base_image_path).size
img_nrows = 400
img_ncols = int(width * img_nrows / height)

# util function to open, resize and format pictures into appropriate tensors
def preprocess_image(image_path):
    img = load_img(image_path, target_size=(img_nrows, img_ncols))
    img = img_to_array(img)
    img = np.expand_dims(img, axis=0)
    img = vgg16.preprocess_input(img)
    return img

# util function to convert a tensor into a valid image
def deprocess_image(x):
    if K.image_data_format() == 'channels_first':
        x = x.reshape((3, img_nrows, img_ncols))
        x = x.transpose((1, 2, 0))
    else:
        x = x.reshape((img_nrows, img_ncols, 3))
    # Remove zero-center by mean pixel
    x[:, :, 0] += 103.939
    x[:, :, 1] += 116.779
    x[:, :, 2] += 123.68
    # 'BGR'->'RGB'
    x = x[:, :, ::-1]
    x = np.clip(x, 0, 255).astype('uint8')
    return x

# get tensor representations of our images
base_image = K.variable(preprocess_image(base_image_path))
style_reference_image = K.variable(preprocess_image(style_reference_image_path))

# this will contain our generated image
if K.image_data_format() == 'channels_first':
    combination_image = K.placeholder((1, 3, img_nrows, img_ncols))
else:
    combination_image = K.placeholder((1, img_nrows, img_ncols, 3))

# combine the 3 images into a single Keras tensor
input_tensor = K.concatenate([base_image,
                              style_reference_image,
                              combination_image], axis=0)

# build the VGG16 network with our 3 images as input
# the model will be loaded with pre-trained ImageNet weights
model = vgg16.VGG16(input_tensor=input_tensor,
                    weights='imagenet', include_top=False)
print('Model loaded.')

# get the symbolic outputs of each "key" layer (we gave them unique names).
outputs_dict = dict([(layer.name, layer.output) for layer in model.layers])

# compute the neural style loss
# first we need to define 4 util functions

# the gram matrix of an image tensor (feature-wise outer product)
def gram_matrix(x):
    assert K.ndim(x) == 3
    if K.image_data_format() == 'channels_first':
        features = K.batch_flatten(x)
    else:
        features = K.batch_flatten(K.permute_dimensions(x, (2, 0, 1)))
    gram = K.dot(features, K.transpose(features))
    return gram

# the "style loss" is designed to maintain
# the style of the reference image in the generated image.
# It is based on the gram matrices (which capture style) of
# feature maps from the style reference image
# and from the generated image
def style_loss(style, combination):
    assert K.ndim(style) == 3
    assert K.ndim(combination) == 3
    S = gram_matrix(style)
    C = gram_matrix(combination)
    channels = 3
    size = img_nrows * img_ncols
    return K.sum(K.square(S - C)) / (4. * (channels ** 2) * (size ** 2))

# an auxiliary loss function
# designed to maintain the "content" of the
# base image in the generated image
def content_loss(base, combination):
    return K.sum(K.square(combination - base))

# the 3rd loss function, total variation loss,
# designed to keep the generated image locally coherent
def total_variation_loss(x):
    assert K.ndim(x) == 4
    if K.image_data_format() == 'channels_first':
        a = K.square(x[:, :, :img_nrows - 1, :img_ncols - 1] - x[:, :, 1:, :img_ncols - 1])
        b = K.square(x[:, :, :img_nrows - 1, :img_ncols - 1] - x[:, :, :img_nrows - 1, 1:])
    else:
        a = K.square(x[:, :img_nrows - 1, :img_ncols - 1, :] - x[:, 1:, :img_ncols - 1, :])
        b = K.square(x[:, :img_nrows - 1, :img_ncols - 1, :] - x[:, :img_nrows - 1, 1:, :])
    return K.sum(K.pow(a + b, 1.25))

# combine these loss functions into a single scalar
loss = K.variable(0.)
layer_features = outputs_dict['block4_conv2']
base_image_features = layer_features[0, :, :, :]
combination_features = layer_features[2, :, :, :]
loss += content_weight * content_loss(base_image_features,
                                      combination_features)

feature_layers = ['block1_conv1', 'block2_conv1',
                  'block3_conv1', 'block4_conv1',
                  'block5_conv1']
for layer_name in feature_layers:
    layer_features = outputs_dict[layer_name]
    style_reference_features = layer_features[1, :, :, :]
    combination_features = layer_features[2, :, :, :]
    sl = style_loss(style_reference_features, combination_features)
    loss += (style_weight / len(feature_layers)) * sl
loss += total_variation_weight * total_variation_loss(combination_image)

# get the gradients of the generated image wrt the loss
grads = K.gradients(loss, combination_image)

outputs = [loss]
if isinstance(grads, (list, tuple)):
    outputs += grads
else:
    outputs.append(grads)

f_outputs = K.function([combination_image], outputs)

def eval_loss_and_grads(x):
    if K.image_data_format() == 'channels_first':
        x = x.reshape((1, 3, img_nrows, img_ncols))
    else:
        x = x.reshape((1, img_nrows, img_ncols, 3))
    outs = f_outputs([x])
    loss_value = outs[0]
    if len(outs[1:]) == 1:
        grad_values = outs[1].flatten().astype('float64')
    else:
        grad_values = np.array(outs[1:]).flatten().astype('float64')
    return loss_value, grad_values

# this Evaluator class makes it possible
# to compute loss and gradients in one pass
# while retrieving them via two separate functions,
# "loss" and "grads". This is done because scipy.optimize
# requires separate functions for loss and gradients,
# but computing them separately would be inefficient.
class Evaluator(object):

    def __init__(self):
        self.loss_value = None
        self.grads_values = None

    def loss(self, x):
        assert self.loss_value is None
        loss_value, grad_values = eval_loss_and_grads(x)
        self.loss_value = loss_value
        self.grad_values = grad_values
        return self.loss_value

    def grads(self, x):
        assert self.loss_value is not None
        grad_values = np.copy(self.grad_values)
        self.loss_value = None
        self.grad_values = None
        return grad_values

evaluator = Evaluator()

# run scipy-based optimization (L-BFGS) over the pixels of the generated image
# so as to minimize the neural style loss
if K.image_data_format() == 'channels_first':
    x = np.random.uniform(0, 255, (1, 3, img_nrows, img_ncols)) - 128.
else:
    x = np.random.uniform(0, 255, (1, img_nrows, img_ncols, 3)) - 128.

for i in range(iterations):
    print('Start of iteration', i)
    start_time = time.time()
    x, min_val, info = fmin_l_bfgs_b(evaluator.loss, x.flatten(),
                                     fprime=evaluator.grads, maxfun=20)
    print('Current loss value:', min_val)
    # save current generated image
    img = deprocess_image(x.copy())
    fname = result_prefix + '_at_iteration_%d.png' % i
    imsave(fname, img)
    end_time = time.time()
    print('Image saved as', fname)
    print('Iteration %d completed in %ds' % (i, end_time - start_time))
```

Copy the code above and save it as neural_style_transfer.py (any name will do).

## 4. Running it

Create a new, empty folder and put the neural_style_transfer.py file from the previous step into it. Then put the style template image and the image you want to transform into the same folder.

```bash
# usage:
# python neural_style_transfer.py <path to the image to transform> <path to the style template image> <prefix for the generated images (no .jpg or other suffix)>
python neural_style_transfer.py './me.jpg' './starry_night.jpg' './me_t'
```
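The post applies the same Starry Night template to seven campus photos, one run per photo. If you want to do something similar, a small driver script can invoke the program once per image; this is an illustration of mine rather than part of the original post, and the file names are placeholders.

```python
# Hypothetical batch driver (not from the original post): applies one style image
# to several photos by running neural_style_transfer.py once per photo.
import subprocess

STYLE = 'starry_night.jpg'                              # placeholder style image
PHOTOS = ['lake.jpg', 'library.jpg', 'east_gate.jpg']   # placeholder photo names

for photo in PHOTOS:
    # output prefix without extension, e.g. lake_styled_at_iteration_9.png
    prefix = photo.rsplit('.', 1)[0] + '_styled'
    subprocess.run(
        ['python', 'neural_style_transfer.py', photo, STYLE, prefix],
        check=True,  # stop the batch if one conversion fails
    )
```

Each run still pays the full optimization cost, so on a CPU this stays slow (the author's machine froze twice while producing the seven pictures).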
A screenshot of the iteration output:

[Image: console log of the L-BFGS iterations]

## Comparison across iterations

[Image: the generated image at different iteration counts]

## Style transfer with other libraries

- Based on the Python deep learning library DeepPy: GitHub - andersbll/neural_artistic_style (Neural Artistic Style in Python)
- Based on the Python deep learning library TensorFlow: GitHub - anishathalye/neural-style (Neural style in TensorFlow!)
- Based on the Python deep learning library Caffe: fzliu/style-transfer
Be confident and cheerful, stay emotionally in sync, eat more oranges and grapes… ah, not that GC.

Once more, this portal: "Can the local variables in this piece of Java code be reclaimed early? Can the compiler or the VM perform the following manual optimization? - RednaxelaFX's answer"
<- Please read that first to understand what the JVM actually can and cannot do, then come back.

Which Java coding habits are friendly to the GC? If you insist on the details, you have to analyze it against a concrete JVM implementation, because different JVMs, and even different GC implementations within the same JVM, each have their own characteristics.
But the general, usually effective advice is actually simple:

- Write simple, straightforward code and don't play tricks. Over-design and too many layers of wrapping/abstraction often make the GC suffer (they increase the number of objects it has to deal with).
- Understand this: the GC is a partner, not a servant. As long as the code stays well-structured and easy to follow, reducing unnecessary object allocation is always a good thing.
- Don't call System.gc() <- it can distort the GC's statistics and its future decisions.
- Don't use "object pools" casually <- pooling objects just to optimize GC is often very harmful. Pooling for some other useful purpose, for example holding resources that are expensive to initialize, is the usually legitimate scenario.
- You normally don't need to bother setting local variables to null <- the portal at the top explains this in detail.
- Be careful with ThreadLocal, especially in combination with thread pools <- if you run tasks on a thread pool and those tasks write data into a ThreadLocal, make sure the ThreadLocal is cleaned up when a task finishes, otherwise it leaks easily.
- If you use off-heap memory as a cache of Java objects, and what you store off-heap is serialized Java objects, watch out for the deserialization cost on access and the frequent object creation that comes with it.
- If the program uses NIO, keep an eye on DirectByteBuffer usage; for example, if System.gc() is disabled and the program has been tuned so that GCs are very rare, dead DirectByteBuffers may not get released in time. See point 1 of this portal: [HotSpot VM] The various pitfalls of the "standard" JVM tuning parameters.
- Regularly read and analyze the GC logs (or do similar monitoring some other way, for example via JMX) <- if there is no problem, don't "optimize" blindly; if there is one, find and analyze it promptly.

About "the GC is a partner, not a servant": it means that although you very occasionally run into a GC bug that causes a memory leak, in general you can trust the JVM's GC to collect every garbage object the program no longer needs. But this should be a two-way communication (partner) model, not a one-way command (servant) model. We write the program, and the GC knows which objects to collect; in return, the GC gives us feedback (GC logs, JMX monitoring, and so on) telling us how it is doing and whether it needs our help to improve its behavior.

The above is application-level advice; what remains is GC tuning, which is a different matter and no longer a question of "coding habits".
**Repeat after me: life is short, JetBrains is the way!**

The philosophy of JetBrains IDEs: for all of your development work, you only need to open a JetBrains IDE, and only that!
JetBrains IDEs are cross-platform and support the mainstream languages: C, C++, Java, Python, PHP, Ruby, and more.

**The IDE that fits the asker's needs is CLion.**
https://www.jetbrains.com/clion/

What the screenshot below shows:

> 1. PyCharm open on my local Mac.
> 2. The FTP plugin connected to an Aliyun host (CentOS 6).
> 3. The SSH plugin connected to the same Aliyun host.
> 4. The FTP plugin is used to open remote code files for editing, modification and synchronization.
> 5. The SSH plugin is used for remote running and debugging.

**The experience in CLion, IDEA, RubyMine and PhpStorm is essentially the same.**

[Image: PyCharm with remote FTP and SSH sessions open]

Scenarios where this fits:

> 1. The asker's embedded C/C++ development. I often see people on Windows using Source Insight, configuring Samba in a VM and mounting it into Windows just to edit files; watching them edit is painful.
> 2. Server-side web development. Occasionally you hit the situation in my screenshot: you need to write some test code on a public cloud host, and editing locally then uploading over and over is a pain, while the vim on the cloud host is usually unconfigured and miserable to use. So the old friend PyCharm comes to the rescue.

Advertising time:

> Want a terminal? Built-in plugin!
> Want to connect to a database? SQL or NoSQL, there are plugins for both!
> SSH login? There's a plugin!
> FTP login? There's a plugin!
> git, svn, hg? All supported by plugins!
> Want to make git commits and browse the commit log? There's a plugin!
> Docker, Vagrant? Yes, yes, yes!

Discover the other black magic slowly for yourself...

Aside:

> These days there are still plenty of posers who love to show off that they develop in vim. What many of them don't realize is that the Vim in a master's hands and the Vim in a poser's hands are completely different things.
> A master's Vim is configured to the point where it's basically an IDE.
> Meanwhile many posers run a Vim with no configuration at all, grinding away as if they were writing code in Notepad, and still feel great about themselves.
> Vim has a lot of powerful features; it just has a rather steep learning curve.
> An IDE simply lowers that barrier and lets you use a lot of powerful features comfortably and happily.
> A while ago in some tech chat group, someone declared that all paid IDEs are garbage. I couldn't even be bothered to argue.
> At the root of it: a narrow view of the world.

**JetBrains IDEs have a huge number of powerful features; genuinely good value.**

**Also, I suspect Eclipse and Visual Studio have similar capabilities; interested readers can explore them on their own.**

**Repeat once more:**
**JetBrains is the way!**
**JetBrains is the way!**
**JetBrains is the way!**
```js
$scope.showYouMyHeart = function () {
    if (
        $scope.theWayYouAre() ||
        LoveFormService.dataHasError($scope.Data[$scope.selectedlLoveType.value])
    ) {
        return false;
    }
```

```js
$('#IWontWaitForYouAnyMore').modal();
$scope.allAboutYou = true;
$scope.stillWaiting = true;
$timeout(function () {
    $scope.stillWaiting = false;
    $scope.allAboutYou = false;
    $scope.meBeforeYou = true;
    $scope.goodBye = true;
}, 000);

$scope.hideIWontWaitForYouAnyMore = function () {
    $('#IWontWaitForYouAnyMore').hide();
    $scope.IloveYou = false;
};
```

```jade
#IWontWaitForYouAnyMore.modal.fade(tabindex='-1', role='dialog', aria-labelledby='myModalLabel', aria-hidden='true')
  .modal-dialog
    .modal-content.no-footer
      .modal-body
        .wait
          p.desc(ng-show='allAboutYou') The one I love is a peerless hero; one day he will come to marry me on clouds of seven colors.
          p.desc(ng-show='meBeforeYou') I guessed the beginning, but I could not guess the ending.
          spinner(loading='stillWaiting')
          button.btn.btn-primary(ng-show='goodBye', ng-click='hideIWontWaitForYouAnyMore()') Goodbye.
```

000 ms: ten thousand years.

"Still waiting?"

"Not anymore."

"Why?"

"Why should I?"
Many answers here are already quite good.

But there are plenty of very capable armchair strategists in the tradition of Ma Su, and I have suffered at their hands. If you only ask questions like "is HashMap thread-safe?", you get fooled very easily. Even implementation-detail questions can be handled by pure theorists, because some books are written in great detail; anyone who can recite a book can handle that kind of question, possibly without any coding experience at all.

In the spirit of combining theory with practice, I generally don't ask those pure-knowledge, theoretical questions. Instead I ask the candidate to write a program that demonstrates that HashMap is not thread-safe, and then to modify that program to make it thread-safe.

In an interview, if circumstances allow, it's best to give the candidate a computer with access to Google plus some space and time of their own, say 10-20 minutes alone in a small meeting room, and then attach a second monitor and watch in real time how they work on the machine. From that you can see their approach and way of thinking about the problem. Of course, you have to tell them in advance about the second monitor.

Or, to save time, bring out a pre-written thread-unsafe program and ask them to explain why it produces such strange behavior.

By analogy, you can ask for a program that shows whether a List stores references or object copies, a program that demonstrates the difference between weak and strong references, a program that demonstrates the effect of some design pattern, and so on.
This approach checks not only the fundamentals but also thinking ability. If someone's understanding of the concepts isn't thorough, such a "proof" program is actually quite hard to write…

What makes this approach even more powerful is that you don't have to know Java very well yourself to roughly gauge the other person's level.
For example, if you can't understand their program, you can simply ask them to explain it as if you were a student. Someone who truly understands a concept knows how to explain it in plain, accessible terms; they can take an idea floating in the air and walk it all the way down, clearly, until it lands at your feet within easy reach. A person who can explain a concept to that degree is usually far above average in the field.
When I interview, I sometimes play dumb on purpose to see whether the other person can explain things clearly. Someone with a half-baked understanding will tie themselves in knots, which is quite entertaining; from a real expert, I genuinely learn a lot. Of course, sometimes I really don't understand and learn humbly, but by asking questions as I learn, I can basically tell what level the other person is at.
From a hiring perspective, a method that can find people stronger than yourself is the most promising one. Otherwise you can only hire people weaker than yourself, and then the company's future is capped by the ability of its leaders…

=== Divider: experts can skip the part below ===

I saw people in the comments saying this method is too harsh and hard to pass.
I suspect those saying that are still students. For someone with a few years of experience who truly puts care into their work, I wouldn't claim that every program of this kind comes off the top of their head, but at least in areas they're reasonably familiar with, a bit of thought is enough to get it written.
For example, I wrote a thread-safety demo of this kind more than a decade ago (based on List rather than Map):
https://github.com/programus/java-experiments/blob/master/refs/JavaTraining/src/com/ibm/javacore/collections/threadsafe/ThreadSafeDemo.java
It's less than 50 lines in total; take a look if you're interested.
Questions like this have no standard answer, and often no single best answer either. Mine isn't necessarily a great one; it was just something I dashed off back then for new-hire training at the company. (When I wrote it, we were still on Java 1.4 and List had no generics support yet… so new ArrayList() didn't even trigger a warning…)
Here's a middle-of-the-night screenshot: the Spark SQL support for sparkling that I'm currently working on. Please ignore the pile of error messages in the REPL; I agree the mistakes look a bit silly.

With Emacs, for Haskell/Python/Clojure/Common Lisp you can fairly easily configure a similar REPL-based development environment that integrates programming, testing, debugging, and interactive coding, which I personally feel matters more than code completion and the like. Strictly speaking you'd also want code/project structure browsing and navigation for it to count as reasonably complete, but I couldn't be bothered to fiddle with speedbar. If you really want a complete setup, you might as well buy an IDE once you have the money.

On OS X, for documentation there is Dash, which has Emacs integration, and Emacs itself fits into OS X quite well. For engineers working on OS X it's worth learning. Even though I also feel it's getting a bit old, in a time when Atom is frequently unreliable, VS Code has only just started, and Sublime Text is strong but its ecosystem still isn't enough for my tinkering, Emacs remains my best choice for a consistent, low-hassle working environment across the servers and my workstation.

[Image: Emacs with a Spark SQL REPL session and error output]

And one more of Dash at this very moment.

[Image: the Dash documentation browser]
Android technology moves fast; keeping up with and learning the newest techniques broadens your horizons and helps a lot both in future project work and in your career.
Under this answer I'll share techniques from time to time; let's support each other and learn together!

**Android Data Binding**

We often have to write a lot of findViewById calls in an Activity; any slightly complex screen means a big pile of findViewById, and the repeated code also increases coupling. Data Binding lets us throw all those findViewById calls away. Of course, Data Binding is not limited to that.

First, in your app's build.gradle add:

```
android {
    dataBinding.enabled = true
}
```

[Image: the build.gradle change]

Next, in your layout file, wrap the existing layout in <layout></layout>, like this:

[Image: the layout file wrapped in a <layout> tag]

Then, in the Activity, bind the data like this:

[Image: the Activity binding code]

Here **ActivityDatabindingBinding** is generated automatically from the layout file **activity_databinding.xml**, and **xiaoGongJu**, as you have probably noticed, is simply the **id** of the **TextView** control.

Finally, the result:

[Image: the running app]

**Material Theme UI for JetBrains**

Under other answers people ask whether I can say which theme my IDE uses. This has little to do with technique, but to do a good job, one must first sharpen one's tools.

The theme recommended here is this one:
GitHub - ChrisRM/material-theme-jetbrains: JetBrains theme of Material Theme

Installation notes:

[Image: the plugin settings page]

Tutorial screenshot:

[Image: searching for the plugin in the plugin browser]

After finding the plugin, click install, then Apply, and restart Android Studio; the installation is done.

[Image: the plugins dialog after installation]

Once it succeeds, click Font and you'll see three new themes: Darker theme, Default theme, and Lighter theme.
You can pick colors and settings based on these three, e.g. Material Theme - Darker Font sets the font for the current theme.

Finally, the result:

[Image: Android Studio with the Material theme applied]
That's exactly what a token is for… the token was designed precisely so that you can use it this way. The main purpose of a token is to avoid keeping the user's username and password around for a long time, so they are exchanged for a random code. That code itself stands in for the username and password, so once it leaks, the consequences are naturally just as serious.
Nowadays, APIs with high security requirements generally allow HTTPS only, so the contents of the connection can't be captured anywhere along the path, and the token won't be…
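To make the pattern concrete, here is a minimal sketch in Python (my illustration, not part of the original answer; the endpoint URLs and JSON field names are placeholders): the client sends the username and password once over HTTPS, receives a token, and afterwards sends only the token with each request.

```python
# Hypothetical example of the token pattern described above.
# The endpoint URLs and JSON field names are placeholders, not a real API.
import requests

API = 'https://api.example.com'

# Exchange the username/password once for a token (over HTTPS only).
resp = requests.post(f'{API}/login',
                     json={'username': 'alice', 'password': 'secret'},
                     timeout=10)
resp.raise_for_status()
token = resp.json()['token']   # a random code that now stands in for the credentials

# Later requests carry only the token, never the password.
profile = requests.get(f'{API}/profile',
                       headers={'Authorization': f'Bearer {token}'},
                       timeout=10)
print(profile.json())
```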
