Does jQuery EasyUI conflict with jsPlumb?

Getting Started with jsPlumb (implementing HTML5 drag-and-drop connections)
jsPlumb is a powerful JavaScript connector library. It can link HTML elements together with arrows, curves, or straight lines, which makes it well suited to building diagramming and modeling tools on the web. It supports three JavaScript frameworks: jQuery + jQuery UI, MooTools, and YUI3. You can get a feel for its capabilities from the demos on the official site. There is currently very little jsPlumb material available in Chinese, so I hope this tutorial helps people get up to speed with jsPlumb faster. To keep the length manageable, this tutorial uses jQuery throughout.
Browser compatibility
Before using jsPlumb, you should first understand how well each browser supports it. jsPlumb supports IE6 and later, as well as all major browsers, but there are still a few bugs:
On IE9, a bug in the SVG-related code of jQuery 1.6.x and 1.7.x prevents mouse-hover events from being handled.
Safari 5.1 has an SVG bug that prevents mouse events from passing through the transparent areas of SVG elements.
Using SVG with MooTools on Firefox 11 causes some problems.
Downloading and including jsPlumb
The jsPlumb source code and demos can be downloaded from GitHub; if you don't want to download the whole project, you can download version 1.4.0 directly from here.
Along with jsPlumb, you also need to include jQuery and jQuery UI. Note that jsPlumb is only compatible with jQuery 1.3.x and later, and it has been tested against jQuery UI 1.7.x, 1.8.x, and 1.9.x. If you use jQuery UI 1.7.x or 1.8.x, you also need to include jQuery UI Touch Punch.
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.8.1/jquery.min.js"></script>
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jqueryui/1.8.23/jquery-ui.min.js"></script>
<script type="text/javascript" src="PATH_TO/jquery.jsPlumb-1.4.0-all-min.js"></script>
jsPlumb can only be used once the DOM has finished initializing, so we call jsPlumb methods inside the following callback:
jsPlumb.ready(function() {
    // some code
});
First, we set some defaults for jsPlumb and then declare an exampleDropOptions variable.
jsPlumb.importDefaults({
    DragOptions : { cursor: 'pointer' },   // cursor shown while dragging over an element (controlled via CSS)
    PaintStyle : { strokeStyle:'#666' },   // default connector color
    EndpointStyle : { width:20, height:16, strokeStyle:'#666' }, // default endpoint color
    Endpoint : "Rectangle",                // default endpoint shape
    Anchors : ["TopCenter"]                // default endpoint position
});

var exampleDropOptions = {
    hoverClass:"dropHover",   // CSS class applied while hovering over a drop target during a drag
    activeClass:"dragActive"  // CSS class applied to elements that can accept the drop
};
Adding jsPlumb endpoints
Next, declare two types of endpoints.
var color1 = "#316b31";
var exampleEndpoint1 = {
    endpoint:["Dot", { radius:11 }],              // endpoint shape: a dot
    paintStyle:{ fillStyle:color1 },              // endpoint color
    isSource:true,                                // can be dragged from (used as a connection source)
    scope:"green dot",                            // only endpoints with the same scope can be connected
    connectorStyle:{ strokeStyle:color1, lineWidth:6 }, // connector color and thickness
    connector: ["Bezier", { curviness:63 }],      // use a Bezier curve for the connector
    maxConnections:1,                             // maximum number of connections for this endpoint
    isTarget:true,                                // can be dropped onto (used as a connection target)
    dropOptions : exampleDropOptions              // drop-related CSS options
};
var color2 = "rgba(229,219,61,0.5)";
var exampleEndpoint2 = {
    endpoint:"Rectangle",                         // endpoint shape: a rectangle
    anchor:"BottomLeft",                          // endpoint position: the bottom-left corner
    paintStyle:{ fillStyle:color2, opacity:0.5 }, // endpoint color and opacity
    isSource:true,
    scope:'yellow dot',                           // as above
    connectorStyle:{ strokeStyle:color2, lineWidth:4 }, // as above
    connector : "Straight",                       // use a straight line for the connector
    isTarget:true,
    maxConnections:3,                             // as above
    dropOptions : exampleDropOptions,             // as above
    beforeDetach:function(conn) {                 // show a confirmation dialog before a connection is detached
        return confirm("Detach connection?");
    },
    onMaxConnections:function(info) {             // called when the maximum number of connections has been reached
        alert("Cannot drop connection " + info.connection.id + " : maxConnections has been reached on Endpoint " + info.endpoint.id);
    }
};
Binding endpoints to HTML elements
var anchors = [[1, 0.2, 1, 0], [0.8, 1, 0, 1], [0, 0.8, -1, 0], [0.2, 0, 0, -1]],
    maxConnectionsCallback = function(info) {
        alert("Cannot drop connection " + info.connection.id + " : maxConnections has been reached on Endpoint " + info.endpoint.id);
    };

var e1 = jsPlumb.addEndpoint("state2", { anchor:"LeftMiddle" }, exampleEndpoint1); // add an exampleEndpoint1-style endpoint to the element with id "state2"
e1.bind("maxConnections", maxConnectionsCallback); // callbacks can also be bound after the endpoint has been added
jsPlumb.addEndpoint("state1", exampleEndpoint1);   // add an exampleEndpoint1-style endpoint to "state1"
jsPlumb.addEndpoint("state3", exampleEndpoint2);   // add an exampleEndpoint2-style endpoint to "state3"
jsPlumb.addEndpoint("state1", {anchor:anchors}, exampleEndpoint2); // add an exampleEndpoint2-style endpoint to "state1", using the dynamic anchor list above
Note that anchors can be dynamic or static. When an endpoint is given an array of anchor positions, the anchor is dynamic: when a connection is made, jsPlumb automatically uses the position closest to the other end. When an endpoint is given a single coordinate or a fixed named position (TopRight, RightMiddle, and so on), the anchor is static and never moves, no matter how connections are drawn. See the official documentation for details.
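To make the difference concrete, here is a small sketch (not part of the original example; it reuses the endpoint definitions above, and the element ids match the HTML shown below):

// Static anchor: a single named position (or one [x, y, dx, dy] coordinate); it never moves.
jsPlumb.addEndpoint("state1", { anchor:"TopRight" }, exampleEndpoint1);

// Dynamic anchor: an array of candidate positions; for each connection, jsPlumb
// automatically uses whichever position is closest to the other end.
jsPlumb.addEndpoint("state2", { anchor:["TopCenter", "RightMiddle", "BottomCenter", "LeftMiddle"] }, exampleEndpoint2);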
HTML and CSS
<div id="state1" class="item"></div>
<div id="state2" class="item"></div>
<div id="state3" class="item"></div>
The HTML just declares three divs. Note that jsPlumb identifies HTML elements by their id, so every element you want to connect with jsPlumb must have an id.
<style type="text/css">
/* applied automatically to endpoints that can accept the current drag */
.dragActive { border: 2px dotted orange; }
/* applied automatically to an endpoint when a dragged connection hovers over it */
.dropHover { border: 1px dotted red; }
.item {
    background-color: #eee;
    width: 100px;
    height: 100px;
}
</style>
That completes a simple jsPlumb connection example; try running it in a browser to see the result.
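If you want to go a little further, connections can also be created from code rather than by dragging, and the boxes can be made draggable through jsPlumb so that attached connections repaint as they move. This is a minimal sketch, not part of the original example (the option values are illustrative):

jsPlumb.ready(function() {
    // Let jsPlumb manage dragging of the boxes, so connections follow them around.
    jsPlumb.draggable("state1");
    jsPlumb.draggable("state2");

    // Create a connection programmatically instead of dragging between endpoints.
    jsPlumb.connect({
        source: "state1",
        target: "state2",
        connector: "Straight",
        anchors: ["RightMiddle", "LeftMiddle"]
    });
});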
Further reading
The example in this article draws on Emiel's tutorial "Getting started with jsPlumb" and the official "Draggable Connections" demo; both are worth a look.
For reasons of length, this article does not cover every jsPlumb feature; you can learn more on the official site. Personally I find the official documentation fairly hard to read, so I recommend studying it together with the official demos, whose source code can be downloaded from GitHub.
Demo:http://jsplumbtoolkit.com/jquery/demo.html
官方文档:http://jsplumbtoolkit.com/doc/home
API:http://jsplumbtoolkit.com/apidocs/files/jsPlumb-1.4.1-apidoc.html
This is my first tutorial; if it was useful to you, please leave a comment. Questions and discussion are welcome too.
Please credit the original source when reposting. Thanks!
jsPlumb not working after loading jQuery EasyUI (conflict between jQueryUI and EasyUI) - Stack Overflow
I've run into a real problem with jsPlumb and jQuery EasyUI.
Here's what happens: I've got a website that uses EasyUI, and I was trying to add some jsPlumb connections to it, but the connections are not behaving the way I want them to.
I prepared a demo with jsPlumb only, and it works. But when I add jsPlumb to the existing site, the connections stop working.
After investigating where the conflict is, I came to this:
jsPlumb is working as long as I don't load EasyUI. After loading EasyUI:
<script type="text/javascript" src="jquery.easyui.min.js"></script>
I cannot create new connections by dragging a source endpoint (I drag the endpoint, but no new connection is created).
Endpoints don't follow the div they belong to (they only reposition themselves correctly when you hover over them with the mouse).
I prepared two jsFiddle demos to show what I mean. The ONLY thing that differs between the two samples is the external jquery.easyui resource.
Working jsPlumb
Not Working:
How can I solve this? Maybe some of you know where the conflict is. My site is too far along to change from EasyUI to anything else at this point, and I really want to use jsPlumb, as I cannot find any toolkit as powerful as this.
As was suggested, I've tried to override draggable; it partially works, but not the way I want it to.
(function($){
    var __old_draggable = $.fn.draggable;
    $.fn.draggable = function(){
        if(this.hasClass('_jsPlumb_endpoint') || this.hasClass('window')){
            return __old_draggable.apply(this, arguments);
        }
    };
    $.extend($.fn.draggable, __old_draggable);
})(jQuery);
Partially because:
Endpoints are no longer draggable - that's good,
But they do NOT create new connections - that's bad.
They do NOT follow the window they are attached to.
Furthermore, I now know that EasyUI overrides jQuery UI's draggable, and that causes this (to me) strange behaviour. Sadly, I'm having trouble forcing the draggable method back to the original jQuery UI method... so I'm looking forward to other solutions to my problem.
Edit: I deleted everything connected with the draggable and droppable overrides in the EasyUI script, and it's working. Now the problem is how to do this in my own code rather than in the EasyUI script, since somebody will eventually update to a new version and everything will stop working...
OK. Here is the best working solution for me.
What happened here is that EasyUI's draggable and droppable methods override the jQuery UI methods. jsPlumb is designed to work with jQuery UI, not with EasyUI, as it turned out.
I've found that the EasyUI framework provides easyloader, which allows you to choose which modules to load! GREAT! I had to change my code a little, removing everything connected with EasyUI from the *.html documents (as no EasyUI modules are known when the DOM is created).
In the HTML document, place
<script type="text/javascript" src="jquery-easyui-1.3.2/easyloader.js"></script>
instead of
<script type="text/javascript" src="jquery-easyui-1.3.2/jquery.easyui.min.js"></script>
Then in your JavaScript, do something like this:
$(function () {
    using(['panel', 'datetimebox', 'tabs',
           'accordion', 'layout', 'linkbutton',
           'datagrid'], function(){
        initAll.call(this);
    });
});

where using is an easyloader function which loads modules. initAll is my function where all the initialization of EasyUI components happens, for example:

var initAll = function () {
    // initialize all html elements which use EasyUI
    $('#layout').layout({fit:true});
    $('#tabscontainer').tabs({fit:true, border:false});
    $('#accordion').accordion({fit: true, border:false});
};
The only thing you MUST be aware of is that you CANNOT use modules which depend on the draggable/droppable modules if you want jsPlumb to keep working. Dependencies can be found in the easyloader.js source code. For EasyUI version 1.3.2 they are: slider, tree, window, combotree, dialog, messager.
There is a fork of jsPlumb dealing with EasyUI. Maybe that helps?
Demo sources are available at
Override EasyUI's $.fn.draggable. Try detecting jsPlumb classes on this and prevent it from manipulating jsPlumb nodes. Something like if(....hasClass('jsPlumbClass')) might help. The libraries are in conflict somehow, sorry about that.
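A hedged sketch of that idea (untested; the class names follow the question, and it assumes the snippet runs after jQuery UI has loaded but before EasyUI, so the saved reference still points at jQuery UI's implementation):

(function ($) {
    // Capture jQuery UI's draggable before EasyUI replaces it.
    var jqueryUiDraggable = $.fn.draggable;

    $(function () {
        // By DOM-ready, EasyUI has loaded and installed its own draggable.
        var easyuiDraggable = $.fn.draggable;

        // Dispatch: jsPlumb-managed nodes get the jQuery UI behaviour, everything else keeps EasyUI's.
        $.fn.draggable = function () {
            if (this.hasClass('_jsPlumb_endpoint') || this.hasClass('window')) {
                return jqueryUiDraggable.apply(this, arguments);
            }
            return easyuiDraggable.apply(this, arguments);
        };
    });
})(jQuery);

Whether this fully restores jsPlumb's behaviour also depends on which other jQuery UI pieces EasyUI replaces (droppable, for example), so the easyloader approach described above is probably still the safer route.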
Miscellaneous Links
Here are a few miscellaneous interesting links.
I don't agree with everything in every link, but
I often learn about interesting pages, and this page links
to those other interesting pages.
Far too many programs have a horrendous user interface; I'd like to see more programs pay more attention to usability.
This is a problem for both proprietary and OSS/FS programs.
An article from PublicDomain.org argues that
"100 years ago we were smarter about copyright, about disruptive technologies,
about intellectual property, monopolies and network effects than we are today".
One small ray of hope is the Righthaven ruling; companies had been trying to get separate attack dogs to sue those who used material under fair use,
while keeping themselves out of the justice system.
The Righthaven ruling shows that if you have a problem, you have to
show up in court and justify your claims yourself.
But that doesn't deal with the fundamental problems.
An article about a very nasty flaw in Microsoft Windows has Microsoft suggesting not to trust Microsoft.
Also, it notes that
"Microsoft revealed for the first time that desktop Windows makes a profit margin of more than 85 percent. To put this in personal terms, for every dollar you spent licensing the OS last year, Microsoft spent less than 15 cents on all Windows packaging, marketing, and, oh yeah, improving the product."
There are some interesting pages available on cross-platform GUI toolkits,
One thing that everyone agreed on was that you should look at
if you're doing it.
should be examined too.
If your GUI needs are very simple (e.g., you don't need full
event-driven development), there are some nice toolkits that can make it easy.
Zenity (for bash) and
(for Python, see ) are two approaches.
For more sophisticated needs,
glade (possibly combined with
) can help.
Programming cross-platform GUI applications, and the simple "dialog"
options don't work?
There are many options, which can be grouped on the basic toolkit or language.
Many people are moving away from using GUI builders to generate code; instead, people use GUI builders to build data structures and call-outs,
and then create a very small program that loads the GUI builder's
data structures (this simplifies changing things).
Here's some info I found:
This is the cross-platform GUI library I hear the most about.
It's implemented in C++, but the
lets you use Python (which is much simpler).
For wxWidgets form-building, there's wxFormBuilder and wxGlade.
If you want a full IDE that supports wxPython,
PythonCard and spe do that (and there are yummable Fedora packages).
PythonCard says that it is "for you if you want to develop
graphical applications quickly and easily with a minimum of effort and coding."
There's also
which supports wxPython, but no Fedora package.
GTK+: In the GTK+ world,
explains that libglade is getting
replaced by gtkbuilder.
Basically, libglade's dynamic loading capability is getting moved into GTK+
itself as , and some cleanup was done for the transition.
Glade will generate gtkbuilder XML directly, but in the meantime, you need
to run a converter program (not a big deal).
Java: Java has some extra decisions.
For Java-native interfaces
you can choose Swing (reference GUI for J2SE)
or Standard Widget Toolkit (SWT), developed by IBM as part of Eclipse.
There's also the older AWT.
SWT tries to stay close to the native platform; Swing tries to abstract away from it.
AWT is a simple toolkit with limited capabilities, but
it does have the advantage of stability.
Qt: "Qt Jambi" is an interface to Qt for Java.
Qt has had licensing issues in the far past, but it's now released under
the LGPL which I think should be great for everyone.
Qt is implemented in C++ with several non-standard extensions,
an implementation approach I don't like, but there are
certainly many happy developers.
is developing an
entire open content Encyclopedia (and a related dictionary, too)
by intentionally working to form a community to build it.
This is a very intriguing project.
Critical decisions that have enabled them to form this community are the
Wiki approach (where anyone can edit anything), a
and the GNU Free Documentation License (GFDL) which ensures that
the resulting text is available for any purpose in perpetuity.
is an excellent piece about
how groups are different than individuals, and what software that supports
groups needs to consider.
references it.
Interesting paper:
By Douglas Clement
IPv4 is running out of address space -
- I want a pair!
For an up-to-date high-level view of attacks and vulnerabilities,
you might want to look at
RV10 is a dynamic list of the ten most critical
and prevalent security vulnerabilities,
updated automatically
and continuously from a sample of a few thousand networks.
tracks which ports are most attacked, and divides attacks by
geographic regions.
Some projects appear to be impossible, such as
solving "NP-complete" problems for a large number of items ("large n").
In contrast, some projects are possible - but unaffordable.
Since the late 1980s, I and some co-workers have had a phrase for
unaffordable projects: "GNP-complete" problems.
They're solvable, but they require a country's entire
Gross National Product to solve.
Thankfully, many GNP-complete problems can be reduced or simplified
so they become affordable, and there's always hope for a breakthrough.
An excellent way to take over a democracy is take control of
its voting system.
Stuffing ballot boxes isn't new, but now we have a high-tech way to
control every ballot box in a country: electronic voting machines.
Stuffing physical ballot boxes requires a lot of dangerous work and
is hard to hide; changing an electronic value to a "desirable"
value can be done by one person in microseconds.
And given some of today's unverifiable
electronic voting systems, it's impossible
to detect that someone has stolen the elections.
I'm very concerned about unverifiable
electronic voting systems, especially since the
manufacturer's leaders appear quite partisan.
MicroVote system.
They say they found the new numbers - but why are those trustworthy?
Independent analysis of Diebold found numerous problems, and internal
memos had a number of scary statements.
documents many of the concerns.
These unverifiable systems are also called
"Direct-Recording Electronic (DRE) systems", because they record vote results
directly into an electronic system (with no possibility of independent
verification or real trustworthiness).
Ariel J. Feldman, J. Alex Halderman, and Edward W. Felten
did a fully independent security study of a Diebold AccuVote-TS voting
machine, and proved that it is
very vulnerable to extremely serious attacks, ones that Michael
Shamos (a computer scientist and voting system examiner in
Pennsylvania) described as "the most severe security flaw ever
discovered in a voting system."
Diebold included a "back door" in its software, allowing
anyone to change or modify the software, and there are no technical
safeguards in place to ensure that only authorized people can make changes.
A malicious individual with access to a voting machine could rig the
software without being detected.
found that
anyone given brief access to the machine can gain
complete and virtually undetectable control over
election results - and how radio emanations from
an unmodified ES3B can tell who voted what from
several meters away.
() is a documentary that helps
explain the issue to the non-technical.
For a silly view,
There's a solution, and that's verified voting - see the
The Verified Voting Foundation advocates
the use of voter-verified paper ballots (VVPBs) for all elections
(so voters can inspect individual permanent records of their ballots
before they are cast and so meaningful recounts may be conducted),
insists that electronic voting equipment and software be open
to public scrutiny, and that random, surprise recounts
be conducted on a regular basis to audit election equipment.
I would add three things:
(1) there must be separate voting stations and ballot readers,
where the ballot reader totals are the only official votes
(this prevents a collusion by the voting station), and
(2) there should be a standard ballot format; this makes it possible
to have independent recounts using equipment from different manufacturers, as
well as making it possible to mix-and-match vendor equipment
(lowering costs for everyone);
(3) there should be standard electronic formats for
defining elections and producing results, again to make it
possible to dramatically reduce costs by enabling mixing and
matching of equipment.
The Open Voting Consortium (OVC) is a non-profit organization dedicated to the development, maintenance,
and delivery of open voting systems for use in public elections.
OVC is developing a reference version of free voting software
to run on very inexpensive PC hardware, which produces
voter-verifiable paper ballots.
Another relevant system is
; they maintain
an OSS/FS program that's
, and they plan to add
a voter verified receipt (a critical need).
Another interesting article is
(Wired, Oct 18, 2006) has some
great suggestions.
notes that
OSS voting systems are no panacea - which is absolutely true, but that
doesn't mean it's not worth considering.
The current shameful system -
where counting is done by unaccountable, unreviewable machines -
is the kind of system that Stalin would have created.
It's pretty scary that the U.S. protects voting for academy award winners
more than voting for U.S. president.
explains how the academy awards counts votes.
Their system is
"designed to make sure each Academy member’s vote is accurately represented".
In particular,
"It is totally analog, and will remain so, in part because the Academy
believes that anything that is in a computer will eventually be hacked."
What, exactly, does that say about the country? Why do we protect the selection
of Oscar winners more than presidential winners?
, an anonymous reader
observed that, even though Diebold had horrifically bad security,
there are financial and political incentives for it.
"Unfortunately, you're not Diebold's customer. The elected officials who
in turn buy the machines responsible for reelecting themselves
are Diebold's customers." (Anonymous Coward,
"Re:My Perception Has Changed Again", September 5, PM).
One reply was
"It's kind of like television. You are not the networks' customer.
The ad companies are; you are the product that is sold to them.
Everything else is just flim-flam designed to keep you in front of the tube."
(Grendel Drago, "It's like television.", September 5, PM).
reports that Virginia and Maryland are switching back to paper.
The counts will still be done electronically, but the voters will
get to use paper directly.... which eliminates many (though not all)
of the risks of computerized voting.
This is good news, especially if they standardize the paper so that
you can recount with independently-developed systems.
The shame is that these states were fooled into buying voting machines that
weren't adequately secure; in my mind, the states should
get their money back.
is an excellent summary on cybersecurity of voting machines.
He made three key points (and in the details he noted that they have to
be secure against nation-states, not just criminals):
Paperless DRE voting machines should be immediately phased out
from US elections in favor of systems, such as precinct-counted
optical scan ballots, that leave a direct artifact of the voter’s choice.
Statistical “risk limiting audits” should be used after every election
to detect software failures and attacks.
Additional resources, infrastructure, and training should be made
available to state and local voting officials to help them more
effectively defend their systems against increasingly sophisticated
adversaries.
There are lots of good search engines available, including
(supported by the U.S. Institute of Museum and Library Services
under the provisions of the Library Services and Technology Act,
administered in California by the State Librarian).
, making it a poor choice for most searches.
project is interesting but
not useful as of November 2003.
is an interesting way to get user feedback.
A very nice summary of how the law works was
"The trial tries the facts. The appeal tries the trial.
The Supreme Court tries the law."
is an interesting article about meetings.
reports that coin-tossing is slightly
biased to whatever side the coin started on, based on a study by Persi Diaconis, Susan Holmes
(statisticians at Stanford University),
and Richard Montgomery of the University of California, Santa Cruz.
If you're creating a program and it needs to install on
Microsoft Windows, you need an installer.
are OSS/FS installers for Windows.
is available on SourceForge
(though it may not support as many Windows versions).
(There's also the proprietary InstallShield, but it appears to me
that most users don't care which installer is used - they just
want things installed).
I've often looked for statistics on computer security
(what vulnerabilities are common, etc.):
@stake published its first application security metrics report in April
2002, analyzing 45 "e-business" applications that @stake
assessed for its clients.
Most are web applications.
@stake found that 70% of the defects analyzed were design flaws that
could have been found using threat modelling and secure design reviews
before the implementation stage of development.
62% of the apps allowed access controls to be bypassed,
27% had no prevention of brute force attacks against passwords, and
71% had poor input validation.
A July 2003 follow-up is in
IEEE Security and Privacy Magazine published another article
with some of that material in
OWASP has a top 10 list.
It was developed by experts instead of by
looking at application statistics, but Chris Wysopal says it's largely
the same as the @stake list.
A very interesting and innovative approach to user interfaces is
Users pick keywords, files, programs, etc. to quickly narrow down
the options to tell the computer what to do.
Very different, yet compatible with current technology.
Interested in Koine Greek? Some interesting information can be found in
, part of the
Make sure you're aware of the
Laugh, it's funny.
Unisys had been threatening many people if they used GIF
(although as far as I can tell, their patents only applied to
writing, not reading, the compressed format often used in GIF).
According to
who searched the patent databases of the USA, Canada, Japan,
and the European Union, here are the relevant dates.
The Unisys patent expired on 20 June 2003 in the USA, but it
does not expire in most of Europe until 18 June 2004,
in Japan until 20 June 2004 and in Canada until 7 July 2004.
The U.S. IBM patent expired 11 August 2006.
The fact that two companies (IBM and Unisys) have been
allowed to have two separate patents on the same algorithm
clearly demonstrates
how poorly patents are examined - the patent office couldn't even
be bothered to search their own patent database for
previously-granted patents.
The IBM patent should have been tossed out immediately, since a
previous patent already covered it.
The patent system desperately needs an overhaul, and the best start would be to eliminate software patents and business method patents; the problems this governmental interference is causing far exceed its supposed benefits.
describes many of the serious problems patents cause in standards,
and standards are absolutely critical to working IT infrastructures.
does a nice job explaining why software patents are economically
a bad idea.
Burnette concludes that "The only solution is to ban software patents
altogether, worldwide. Copyright law provides plenty of protection
for software, just as it does for paintings, poetry, and books."
The patent office doesn't do a credible job evaluating for novelty
and prior art, but even if they did, the problems caused by software patents
far exceed the (supposed) benefits of the system.
briefly explains why software patents should be prohibited.
Software is already protected by copyright law, which is a system
much more appropriate to software.
who after analyzing a massive amount of data found that patents don't
work except in biotech... and that they especially don't work in the
information technology industry.
There's no need to have both copyright and patent law control
software, especially since there's lots of evidence that patents are
impeding instead of aiding software innovation.
goes further, making a strong case for abolishing patents and copyrights entirely. They have lots of useful evidence about the failures of software patents.
(published in the Journal of Science & Technology Law)
argues against software patents; it "recommends a return to the distinction that inventions consisting of information processing plus a trivial physical step be barred from patentability."
Section I provides the technical argument; in it, he explains that
"it is impossible to write a section of the Manual of Patent Examination
Procedure (MPEP) that allows the patenting of software but excludes from
patentability the evaluation of purely mathematical algorithms. The proof of
this is in the formal Church-Turing thesis (that software and mathematical
algorithms are in the same equivalence class) and Knuth’s comment that all
demonstrations of this weaker Church-Turing thesis will
appear repeatedly below. In short, once one type of information processing is
patentable, all types are patentable. Because there are various types of
information processing that many think should not be patentable, the
patentability of any one type of pure information processing creates myriad
problems."
Section II provides an
"why allowing software and business methods to be patentable creates
transaction costs that easily dwarf the benefits that such patent protection
may provide. The key concept behind the discussion is that these
pseudo-industries are massively decentralized, and patents do
not efficiently promote progress in a decentralized industry.
Unlike copyright, independent invention is not a valid
defense against claims of patent infringement.
If there are millions of potential independent inventors,
then the waste and economic loss associated with
restrictions on independent inventors becomes inevitable."
"adding up the settlements we find a variety of companies, some in
traditional software and many elsewhere, paying billions of dollars for
the right to use software they conceived and wrote without
outside assistance—and those are just the headlines."
Although many software patents are also obvious,
"Fixing the obviousness problem would do nothing, however,
to alleviate the problems with applying patents to a massive industry."
Instead, "A great many of the problems with patents that fill the newspapers
and vex businessmen can be solved by reinstating the distinction from Diehr
and its predecessors that indicate a device is patentable only if it is based on
steps that are simultaneously novel and non-trivially physical...
There is a history of court rulings stating that pure
information processing is not patentable, even when a patent draftsman adds
'insignificant postsolution activity' to apply the information to real-world
affairs. Thus, this judicial line distinguishing the patentable from the
unpatentable exactly matches the ideal economic line that divides traditional
industries that prospered with patents from the massively decentralized
information-based industries that have prospered without patents."
The article
by Florence Olsen (FCW.com, Jul. 25, 2005) notes that there's an exodus of patent examiners; reasons cited include not having adequate time to review patents and no training to perform the task,
coupled with a crushing burden of patent applications
that are increasing in both size and number.
The article based much of its information on a GAO study reporting
these many problems, and it also quoted
Jason Schultz, a staff attorney at the Electronic Frontier Foundation, who
says that under the current rules,
"where anything under the sun is patentable, it puts an
unbelievable amount of pressure on the patent office."
what a broken system!
organization that is trying to eliminate the nonsense of software patents.
is trying to "overcome the software patent crisis... We raise awareness about their devastating effects on the emerging information and knowlege society where software predominates and we make our constructive reform proposals heard."
a "non-profit organisation dedicated to establishing a free market
in information technology, by the removal of barriers to competition.
The FFII was largely responsible for the rejection of the
EU software patent directive in July 2005."
a number of research papers, several of which quantitatively
show the problems of software patents.
The ACLU objected that patents must not trump the first amendment.
proposed a method to determining patentability.
made it clear that, although drug patents only affect
drug companies, every organization is an information processing
company (and thus vulnerable to software patents).
here is the
The article
had interesting quotes from the U.S. patent office, as well as interesting
commentary about them.
For example,
Mr. Tariq Hafiz, a patent examiner, explained an examiner's day-to-day
life, and an attendee noted that,
"One aspect of the life of a patent examiner that came into sharp relief
through all of this was the extreme premium placed on time.
The USPTO has a huge backlog of pending applications and limited resources,
so the amount of time an
examiner spends on each application is carefully tracked, and measures
(described by one participant as possibly 'punitive')
taken against those examiners who don't live up to the norm."
One commenter candidly
summarized the situation this way:
"The clear implication of the statement above is that John Doll
is grossly wrong
when he states that patent quality is the number one focus of the USPTO: the
number one and only focus of the USPTO is the amount of time that each patent
examiner spends on a patent. What the USPTO says and what it does are two
different things: in everyday English, it's called lying.
The USPTO is nothing but a corrupt rubber stamp operation...."
notes that patents in the medical field
are making a few rich, but inhibiting innovation in the process:
"The problem is, once it became clear that individuals could own little parcels of biology or chemistry, the common domain of scientific exchange--that dynamic place where theories are introduced, then challenged, and ultimately improved--begins to shrink. What's more, as the number of claims grows, so do the overlapping claims and legal challenges. This isn't merely a hypothetical situation, a 'worst-case scenario' painted by academic hand-wringers. It has already happened, as two professors at the University of Michigan Law School, Michael Heller and Rebecca Eisenberg, observed in a seminal 1998 article in Science magazine...
Heller and Eisenberg dubbed this new dismal state of affairs the 'Tragedy of the Anticommons.' And that's what it is--a tragedy that's still in the making."
They then note, "it's clear who pays for it. You do. You pay in the form of vastly higher drug prices and health-care insurance. Americans spent $179 billion on prescription drugs in 2003. That's up from ... wait for it ... $12 billion in 1980."
Software patents should be eliminated; they do far more
harm than good to the software industry.
But it may be a very long time before they go away - patent lawyers
in particular make a pile of money from them, and they make the rules.
In the meantime, some people are working within the (broken) legal
framework to reduce the unnecessary and serious damage that
software patents cause.
Microsoft is making lots of patent threats... yet won't actually say what
the patents are, and seems to be conceding that it won't actually sue anyone
(what's a threat when you admit you won't actually do anything?).
Microsoft is playing a dangerous game; organizations like the Open Invention Network aid open source software against
software patent attacks.
Their mission is to "further software innovation by acquiring patents
to be used for cross-licensing purposes to defend the Linux
environment - making them available on a royalty-free basis
[to those programs]."
In short, if Microsoft strikes, they risk a massive patent counterattack,
in which their key products (Windows and Office) will suddenly have a wave
of patent lawsuits.
Lots of individual companies (esp. IBM) have relevant patents too.
documents commitments by various patent-holders to not sue under
certain conditions (typically not attacking open source software), which
makes it easy for FLOSS developers but dangerous for others.
In the U.S., a court decision implies that many more software patents
are invalid anyway.
campaign has a massive number
of enrollees.
FSF legal counsel
(here's ).
and with that sort of threatened pyrotechnics, it's unlikely Microsoft will pursue this seriously.
has an interesting quote:
"The tools you rely on to run your business -- being able to fix them when they break -- good idea."
(Suggesting that Open Source software is
critical for the security of one's company.)
are especially noteworthy.
is an interesting
viewpoint from a proprietary software vendor's perspective.
To oversimplify, it comes down to "do it better", which is not a bad
idea for any software developer.
Indeed, all of the actions suggested can also be performed by OSS
developers.
For a sobering perspective, look at
(described in
which calculates what will happen for a given asteroid impact.
If you're looking for security tools,
is a bootable Linux CD with a pile of security tools pre-installed.
No need to touch the hard drive at all -- just boot and run off the CD.
And it even includes !
Another bootable Linux distribution with security tools is
LAS fits on a mini-CD; Knoppix STD has more stuff but requires a regular CD.
Even if you don't want that sort of tool, they're interesting because
they provide an easy way to find a list of OSS/FS security tools.
maintains a list of papers on security, grouped by category.
Open standards are critical, but years ago NIST decided that they
wouldn't support standards testing any more, and customers typically can't
afford to do it themselves.
As a result, standards languish untested, and vendor products are often
gross in their failure to interoperate.
Interesting essays on this (specific to SQL) are available by
Thankfully, there are some test tools for standards.
In particular, there's just no excuse for invalid HTML or CSS, since
there are tools that check them that don't even require installing anything.
In particular, sticking with valid HTML is essentially required for accessibility; many in the world have handicaps, and it's unfair to
prevent their access by failing to follow standards.
I used the W3C validator in particular to validate the HTML of my paper,
though other services like Bobby exist.
The W3C also has a nice
notes that
"When the question of standards is raised in China, officials and companies are quick to focus on one issue: intellectualy property rights...
China's drive to develop its own technology standards (open and closed) is directly linked to its intent to avoid IPR owned by foreign companies...
China does not want its innovation, its industrial development beholden to others. And does not want to spend the next 20 years watching royalties and license fees flow overseas. Even pledges not to sue are unacceptable...
Hence, its move to open standards, and in particular standards without any IPR... at least one official... maintained that a standard is not "open" if it has any IPR in its specification."
Christians are sometimes blamed for destroying much of the literature
of the ancient world.
(and indeed, many worked hard to retain ancient literature).
In particular, though Carl Sagan and Gibbon want to blame Christians for the destruction of the ancient Alexandria library, the historical evidence does not support that claim.
If computing disappeared tomorrow, I'd probably get involved
in biogenetics or law, two fields that are in many ways similar to
software development.
Several folks have noticed how similar law is to software development, and
frankly I find that a little fascinating.
James Grimmelmann has a very good series of articles, "Law School in a Nutshell"; the series uses "Eldred v. Ashcroft" as a way to explain some legal writing.
is always interesting, and
has references to more about the law.
An interesting summary of what copyright does not cover in software,
as determined in various court cases, is given in
(appendix A of IBM's "Sur-Reply Memorandum in Further Opposition to
SCO's Objections to the Magistrate Judge's Order on IBM's Motion to Confine and SCO's Motion to Amend its December 2005 Submission.")
For more on copyright, see the
It makes no sense when people volunteer their work to others, but those
others get exclusive rights to exploit the work.
Standards have had this problem for some time - ISO
shamelessly exploits the volunteers
who write the standards, then turns around and charges exorbitant
fees for the stuff it didn't write.
ISO actually manages to discourage the use of standards with
its foolish policies, which is why
the IETF, W3C, and other organizations which freely distribute standards
have become increasingly relevant (and ISO increasingly less relevant).
The same issue is happening in science.
Open Access approaches allow a scientist to publish in a way
that's viewable to all, instead of enriching a single company who
doesn't even pay the authors.
The famous mathematical problem the
is an interesting anti-DRM tale for children (!).
Here are some interesting articles on
(you give up your legal rights when you buy DRM'ed products).
is very nicely formatted).
Another interesting DRM critique is
is also interesting.
is a short, accessible article.
shows the ridiculous
costs and impositions created by attempting to make DRM work,
is an interesting non-technical discussion, and
is a fun explanation of what Macrovision really said.
attempts to hide the keys for DRM quickly collapsed,
and attempts to censor it failed as well.
After all, there are many legitimate reasons to have the key
(all users need the key so they can play the movies
on most equipment, or to legally back up the fragile HD DVDs;
Linux users need the key simply to
play the HD DVDs that they legally bought).
This means that no more DRM audio CD's will be released.
For years, manufacturers tried to impose DRM on customers, but with
little success.
DVDs are already going the same direction - they had built-in DRM,
with codes that were supposed to prevent people from seeing DVDs bought
in other countries... but now DVD copies are available everywhere, and
DVD players that ignore the codes are commonplace.
so that customers will finally be able to legally buy music with all
the rights they are entitled to under the law.
By one source,
Note that .
Customers want products that serve them and their needs.
answers the simple question,
"What happens to the music you paid for if that company changes its mind?"
It's not pretty.
And it's why customers should reject DRM'ed material.
It's time for vendors to listen to their customers.
is one customer's
story explaining why DRM is a failure - DRM simply prevents legal uses,
ones that customers need, and forces customers to pirate their music.
All because the industry refuses to sell what customers actually want.
is an interesting page.
It notes that
"According to a 1991 eight-state study funded by the National Institute of Mental Health, the insanity defense was used in less than one percent of the cases in a representative sampling of cases before those states' county courts. The study showed that only 26 percent of those insanity pleas were argued successfully.
Most studies show that in approximately 80 percent of the cases where a defendant is acquitted on a "not guilty by reason of insanity" finding, it is because the prosecution and defense have agreed on the appropriateness of the plea before trial.
That agreement occurred because both the defense and prosecution agreed that the defendant was mentally ill and met the jurisdiction's test for insanity.
Clearly, the implication is that the insanity defense is rarely used successfully by malingerers.
Other studies over the past two decades report similar findings.
According to Myths and Realities: A Report of the National Commission on the Insanity Defense, in 1982 only 52 of 32,000 adult defendants represented by the Public Defender's office in New Jersey--less than two tenths of one percent--entered the insanity plea, and only 15 were successful.
A similar number of insanity defense pleadings--"far less than one percent"--were entered in Virginia during the same period.
A 2001 study in Manhattan (Kirschner and Galperin) noted that over a ten year period, psychiatric defenses were offered by only 16 out of every 10,000 indicted defendants.
More than 75% of the time that a psychiatric defense was successful, it was the result of the prosecutors' consent.
Out of nearly 100,000 felony indictments during that period, only 17 juries heard arguments concerning the insanity defense and their deliberations resulted in only 4 insanity acquittals.
These authors concluded, "if the prosecutor does not accept the defense, the judge or the jury is not very likely to accept it either."
There's a lot of stuff related to the old Apple ][ line.
is an amazingly good
emulator of an Apple //gs, and there are many other emulators.
Good sources of info include
Free (no-cost) software for the
lines is available (the list is long).
Heck, there's even an
Unfortunately, for the emulators to run real Apple ][ programs,
you need the Apple ][ ROMs and operating systems.
At least some operating system
as a better replacement for DOS 3.3,
and an old ProDOS can be downloaded from Apple.
ROMs are easy if you have an old Apple (millions do), but others have
a harder time.
Franklin Computer once sold Apple clones with ROMs, but it copied Apple's ROMs (which was, not surprisingly, found to be illegal).
Perhaps someday someone will create an OSS project to recreate a
functional equivalent of the Apple ][ ROMs; even just the ROMs to
boot a disk and a computer would be enough for many binary programs.
Note: Applesoft BASIC was copyrighted by Microsoft.
Forth is an interesting old language, though not practical for most of
today's applications.
Implementations could be interactive, take very little memory
(a few K for good ones), could be fully understood, and be reasonably fast.
But the trade-off is that programmers have to manage their own stack (instead of letting the computer track it for them); this is hard to justify in most real applications.
Still, it's fun to learn, and definitely expands the mind.
has a nice summary of it
(the books "Starting Forth" and "Thinking Forth" are good too).
There are many OO expansions available for Forth; an amazingly short and complete OO extension that works using just standard ANS Forth is called H, and you can learn more about it online.
There's a lesson here for future standards writers.
Much of this was to try to generalize assumptions, which is fine.
However, many years ago there was a Forth-79, followed a few years
later by a FORTH-83 that had many improvements but "grave incompatibilities".
For example, in Forth-79, TRUE returned 1 and NOT inverted a boolean flag.
In Forth-83, "true" became all ones (-1 since two's complement is assumed)
and NOT became bitwise complement.
As in all languages, what matters is what the IF statement treats as true;
in Forth-83, 0 is false, all else is true.
This was a serious screw-up; while the representation of TRUE
can be covered up remarkably often, NOT is extremely common, so
having a specification change the semantics of a common operation
created a big problem.
Later specs solved much of this by defining new names with rigorous
unique semantics.. then people could redefine the words to what they needed.
INVERT inverts all bits,
0= reports if a value is equal to 0 (returning TRUE if it's 0, FALSE if not),
NEGATE flips an integer's sign (so 3 becomes -3)...
and NOT isn't defined at all.
That was actually a good move, since while NOT was widely used, these
two conflicting and incompatible standards meant that there was no
actual agreement on the semantics of NOT.
You could define NOT the way you needed, or search-and-replace all instances
of NOT with the word you meant.
The newer spec handled this well; it still keeps the (arguably better) Forth-83 semantics of boolean values,
but in a way that made it much easier to port software to
(as well as making the intention clearer).
Similar things happened with the way other operations were specified:
they allowed some flexibility for the usual operation, and if you needed
a specific semantic, they provided those as separately-named words.
The original spec authors were too short-sighted and overspecified things
like word length (16 bits).
Later spec writers fixed this, but at least in this case they did it by
removing constraints (you don't have to use 16 bits) and by adding new
operations (e.g., CELL and CELLS) whose names would not conflict with
existing names (and thus didn't cause portability problems).
Spec writers would be wise to think about how hard it is for
language implementers and application writers to transition to the newer
spec... "small" things can be big.
(Python 3, I'm looking at you.)
Users of Unix-like systems usually need to type in their passwords
to log in, and most systems use PAM.
It'd be nice to protect data if it's on a laptop that could get stolen,
or a system that could be broken into while the user isn't logged in.
A nice solution would be to
create a PAM module that used the password entered as a key for
confidential data
(e.g., to decrypt a password or other keyring, or to access
an encrypted filesystem).
If the decrypted information and derived key were removed on log out,
someone who later stole the laptop would have to break the decryption.
This might be useful to add to the
for example.
It's hard keeping track of all the technology news sites.
Some show a set (merging using RSS or other techniques), such as
SCO has made a lot of claims without evidence, but one good thing
has come of it: .
Groklaw has demonstrated the extraordinary power of a place where
people knowledgeable in a wide range of areas like
information technology, law, and journalism can come together to
counter nonsense.
Groklaw finds interesting things posted by others, such as
Neil Wehneman's
SCO's claims have fallen flat.
Earlier on Darl claimed that
MIT "deep divers" had found lots of examples where Linux had copyrighted
code illegally copied from elsewhere - but when asked to provide evidence
to the court, he didn't.
Instead, we now have found out that SCO funded a 4-6 month investigation,
and their investigation exonerated Linux and other open source
software components.
In it, Davidson reported how the company had hired an outside consultant
because "of SCO's executive management refusing to believe that
it was possible for Linux and much of the GNU software to have come
into existance [sic] without *someone* *somewhere* having copied pieces
of proprietary UNIX source code to which SCO owned the copyright. The
hope was that we would find a "smoking gun" somwhere [sic] in code that
was being used by Red Hat and/or the other Linux companies that would
give us some leverage. (There was, at one stage, the idea that we would
sell licences to corporate customers who were using Linux as a kind of
"insurance policy" in case it turned out that they were using code which
infringed on our copyright)."
The consultant was to review the Linux code and compare it to Unix source
code, to find possible copyright infringement. Davidson himself said
that he had not expected to find anything significant based on his own
knowledge of the code and had voiced his opinion that it was "a waste
of time". After 4 to 6 months of consultant's work, Davidson says,
"we had found absolutely *nothing*. ie no evidence of any copyright
infringement whatsoever."
They had found some places where they were identical, but in all cases
that was because they had legally come from a common source
(such as X Windows).
The count of the electoral votes for U.S. president is tracked at a site
(run by Andrew Tanenbaum); another one is the
It would be great to have cheap energy, especially for
transportation, so that the U.S. (and other countries) would be
energy-independent.
has a very interesting article on cellulosic ethanol and
ethanol reconstituters, two very promising technologies.
I'd love to see real research dollars spent specifically on promising
technologies, to get real solutions (soon!) to the current world
dependence on oil.
Looking for great screenshots (for backgrounds, etc.) that also encourage
interest in space?
A fun site is the
you can go there to see their archives.
Examples include
(of Stonehenge).
It's hard to point out highlights because there are so many interesting ones.
If you're interested in trying out GNU/Linux, it's best to start with
one of the better-known distributions, such as Red Hat's, Novell/SuSE's,
Mandriva (formerly MandrakeSoft), Ubuntu, and/or Debian.
I use Red Hat Fedora Core myself, which works well.
But here are a few pointers:
If you choose to install Red Hat Fedora, you might find that you want
to install extras, proprietary add-ons, or change its configuration.
Sources include
; yum is Fedora's package manager,
so improvements in yum help system updates.
; at the least, do:
yum install yum-fastestmirror yum-presto
(yum-presto is automatically installed in Fedora 12).
You can upgrade Fedora systems in various ways.
As of Fedora 9, the new easy way is a network upgrade using
, as discussed in
You can invoke this through your updater.
Before using preupgrade, you should update all packages to the latest version.
Here's the old way, which is now obsolete:
You can do a network upgrade by creating a
"Rescue disk" (which is small), boot it, and update it that way
(this way you don't need to create big CDs with all the packages).
You need to tell it where to get the packages,
E.G., you might say "use HTTP", then "mirrors.kernel.org" for the site, and
then /fedora/releases/7/Fedora/i386/os for the name of download point.
Those with dynamic IP addresses can just use them.
I have statically-assigned IP addresses, so I had to specify them.
No big deal if you have a static IP address,
but be sure to know your IP address, gateway address, and
DNS server addresses; you'll need to give your address in the form A.B.C.D/X, where X is the number of bits in the network prefix; 24 is a common value for X.
From then on, you can use "yum update -y" as needed to get the latest updates.
Fedora 12 looks nice once installed, but I will say I've had more
problems installing it than most.
On a Dell Optiplex 620, I had to add a kernel boot-line
entry "iommu=soft" when running and rebooting else the disk would fail with
"kernel: mpage_da_map_blocks block allocation failed..."
I can do that, but that is a disaster for non-technical people.
Fedora 12's installer can also break existing dual-boot configurations, because it switches the "active" flag (this will be fixed in Fedora 13).
Similarly, if you choose Ubuntu, you should grab the
which has lots of great information.
GNOME uses a new "spatial mode" in its file viewer, which some people hate.
The fix is easy: while viewing files, select Edit/Preferences,
view the "Behavior" tab, and select "Always open in browser windows".
If you have an ancient version of GNOME, or want to automate this selection,
you can use "gconf-editor" to do this.
In that case, you may need to install it
(in Fedora, install package "gconf-editor").
You can then run gconf-editor from the GUI by selecting the "Main Menu"
(foot or distribution symbol), then select
System Tools / Configuration Editor;
then turn on the checkbox for
/apps/nautilus/preferences/always_use_browser.
You can do this from the command line in one step by typing this command
as one line into a terminal:
gconftool-2 --type bool --set
/apps/nautilus/preferences/always_use_browser true
If you plan to install a dual-boot system with Linux and Windows XP
on the same system, and install a 2004-era Linux distribution,
there's really important information you need about
a bug you may encounter.
Some mid-2004 Linux releases based on Linux 2.6 (including SUSE 9.1 and others) have a bug that in rare cases causes Windows XP to not boot after
a Linux installation.
If this happens to you, don't panic; instead, see the page that describes a workaround, and how to fix this if it happens
(the bottom line is a single command, "sfdisk -d /dev/hda | sfdisk --no-reread -H255 /dev/hda", change hda to whatever your boot drive is in the rare case where it's different).
For the few for whom that doesn't work, change the BIOS setting for the drive
from CHS or AUTO into LBA (and if that doesn't work, HUGE)
(this is per reports from Fedora users; don't switch from HUGE to LBA though).
have more technical information.
This is a nasty problem, but it's easily fixed, so don't panic.
I expect the next releases will fix this problem so you don't have to do this,
but it's worth noting for the mid-2004 users.
Or just erase MS Windows; that works too :-).
is an interesting video pressing the case for GNU/Linux; it starts slow, but it's worth watching all the way through.
are amusing too.
On Linux, if you can't get eject to work (e.g., "eject /dev/sr0" or
"eject cdrom" fails), try using SCSI directly (e.g., "eject -s /dev/sr0").
The phrase "Behold! For now I wear [the] human pants"
and variants seems to have caught the web by storm by
September 15, 2004.
As far as I can tell,
this phrase comes from the comic strip
(warning: language).
(It makes fun of those with
delusions of grandeur about something that is actually trivial.)
This is an amusing example of a meme spreading.
Of course, this is nowhere near as widespread as the silly phrase
"All your base are belong to us" which has its own
Studying the spread of silly memes might actually help us understand
the spread of important ideas.
gives some useful historical information.
The Linux kernel developers have a controversial policy: All kernel drivers
are part of the kernel, and the interface between drivers and the rest of
the kernel can be changed at any time
for more information).
Well, it's actually not controversial to the Linux kernel developers;
it's controversial to proprietary driver developers and
the microkernel community.
The Linux kernel developers don't want to be stuck with an unchangeable internal driver interface;
it often needs to be changed.
The main reason to do otherwise would be to support proprietary drivers, but
proprietary drivers can't be fixed by the kernel developers, so for
reliability's sake it's better to inhibit their use.
Those who do the work should have the right to decide their rules, but there's
some evidence that this is actually sensible.
First of all, there's the evidence that a vast number of Windows crashes are actually caused by buggy drivers; Linux reliability numbers are far greater, suggesting that their approach is producing less buggy drivers. (Microsoft has developed many tools that try to compensate.)
But here's another source suggesting this isn't insane... X Windows.
noted that the "policy of splitting the X drivers from the core server has not worked as well as they would have liked. It adds a whole set of API compatibility issues between the two, making it hard to develop and release improved versions of the server. Keith now thinks that the Linux kernel developers got it right by keeping drivers inside the kernel."
Now I do think that the microkernel folks do have a point -
there's no need for drivers to have unregulated control over everything.
But that doesn't require a microkernel; a simple language that specified
"what access rights are needed", that was enforced when running a driver,
would suffice.
There are all sorts of interesting articles on lessons learned from
developing OSS/FS programs.
Linus Torvalds has posted his own recommendations on the subject.
Under the "life is strange" category, there is even a real-life item
to go with the joke.
In October 2004, Michal Zalewski posted a pair of postings to Bugtraq
about web browsers:
He wrote a program (mangleme)
to generate random output, and found that web browsers
crashed quickly when given this random data.
Internet Explorer (IE) lasted a little longer, but not really very long, and
his test seems to have been unusually gentle to IE (it intentionally
avoided CSS, which happens to be one of the main problems in IE).
Thankfully, it appears that the Mozilla folks worked quickly to fix
the problems.
The relevant bug reports show that the Mozilla developers
quickly found and fixed them, and one person said:
"I used the last three days and now tested several thousand garbled pages.
Since the fix of bug 265404 the tool didn't find any new crashers. I guess
Michal has to come up with a new version. :-)"
Hopefully the IE developers will quickly fix their problems too!
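To get a feel for how little machinery such a test needs, here is a minimal sketch of a mangleme-style page generator (this is not Zalewski's actual code, just the general idea): it spits out pages full of random, frequently malformed tags and attributes, and a browser is then pointed at a fresh page over and over.

# Minimal sketch of a mangleme-style HTML fuzzer (not the original tool):
# emit a page full of random, often malformed, tags and attributes.
import random
import string

TAGS = ["html", "body", "table", "tr", "td", "div", "span", "font",
        "marquee", "object", "img", "a", "form", "input", "frameset"]

def rand_text(n):
    return "".join(random.choice(string.printable) for _ in range(n))

def rand_tag():
    tag = random.choice(TAGS)
    attrs = " ".join(
        f'{random.choice(string.ascii_lowercase) * random.randint(1, 8)}'
        f'="{rand_text(random.randint(0, 64))}"'
        for _ in range(random.randint(0, 5)))
    closer = f"</{tag}>" if random.random() < 0.5 else ""   # often unbalanced
    return f"<{tag} {attrs}>{rand_text(random.randint(0, 32))}{closer}"

def mangled_page(n_tags=200):
    return "<html>" + "".join(rand_tag() for _ in range(n_tags))

if __name__ == "__main__":
    # Write one garbled page; a real harness serves a fresh one per request.
    with open("mangled.html", "w") as f:
        f.write(mangled_page())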
An article by Jeff Yang (special to the San Francisco Gate,
Thursday, December 9, 2004) should worry the U.S.: there is an
increasing gap, with more and more innovations available only in Japan,
none of which are made or even available in the U.S., and that is concerning.
I did a little searching on how to filter out porn images and
other nasties, if you don't want them.
I found an OSS/FS implementation of an algorithm to detect porn images,
based on a larger project to detect 'bad' things called POESIA.
You can browse the code;
see the "ImageFilter" and "Java" subdirectories for code,
and "Documentation" for - well, you can guess.
POESIA can also detect certain symbols, like swastikas,
if you want it to.
There may be others; this is just one I found.
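To give a flavor of what the simplest image filters do, here is a toy illustration (this is not POESIA's algorithm, just the crude first cut such filters often start from; it assumes the Pillow library and a local example.jpg):

# Toy illustration only -- not POESIA's algorithm.  Many filters start
# from a crude heuristic ("what fraction of pixels look like skin?")
# and then layer smarter analysis on top.  Requires the Pillow library.
from PIL import Image

def skin_ratio(path):
    """Fraction of pixels falling in a rough RGB skin-tone range."""
    img = Image.open(path).convert("RGB")
    img.thumbnail((256, 256))            # downscale for speed
    pixels = list(img.getdata())
    def looks_like_skin(rgb):
        r, g, b = rgb
        # A well-known rough RGB rule of thumb; real filters do better.
        return (r > 95 and g > 40 and b > 20 and
                r > g and r > b and (r - min(g, b)) > 15)
    return sum(1 for p in pixels if looks_like_skin(p)) / len(pixels)

if __name__ == "__main__":
    ratio = skin_ratio("example.jpg")    # example.jpg is just a placeholder
    # 0.4 is an arbitrary illustrative cutoff, not a recommended threshold.
    print(f"skin-tone pixel ratio: {ratio:.2f}",
          "(suspicious)" if ratio > 0.4 else "(probably fine)")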
On a related topic, you might find an article
by Joe Bolin
(July 1, 2004) interesting as well.
There is an interesting survey of programming language popularity; it's the
TIOBE Programming Community Index.
There are lots of others, if you like that sort of thing.
A wonderful page for computer people
gives a nice, quick way to access the
Wiktionary for definitions.
One writeup describes how to configure Linux-based operating systems so they'll use
Windows' proprietary Active Directory service (so you can
centrally manage accounts and do single sign-on).
There's an interesting tale about experience with the Creative Commons license.
One essay is important to think about.
Basically, copyright is creating a massive digital divide.
Its author argues that copyright is now often used to support societies,
which means that instead of supporting enlightenment, it supports "elite-nment";
only a few have access to a lot of the scientific data now being developed,
and the money is not used to support further research;
it is used to enrich those who did not do the research.
Optimization is sometimes necessary, but too many people forget to
measure before optimizing.
A far better example is the experience of people who use
a boot-profiling tool that helps
visualize what happens when a Linux/Unix system boots.
Various distribution implementors have used this tool to speed up booting,
and its output is also interesting for showing what happens when a
Linux/Unix system starts up.
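The same "measure first" advice applies inside a single program; here is a minimal sketch using Python's built-in profiler to find out where the time actually goes before changing any code (the two functions are contrived examples):

# Minimal "measure before optimizing" sketch: profile first, then decide.
import cProfile
import pstats

def slow_concat(n):
    # Deliberately naive: repeated string concatenation is quadratic.
    s = ""
    for i in range(n):
        s += str(i)
    return s

def fast_concat(n):
    # The usual fix -- but only worth doing if profiling shows it matters.
    return "".join(str(i) for i in range(n))

if __name__ == "__main__":
    profiler = cProfile.Profile()
    profiler.enable()
    slow_concat(50_000)
    fast_concat(50_000)
    profiler.disable()
    # Print the few most expensive calls, sorted by cumulative time.
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)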
Let's use the term "hirabah", not "jihad", for the terrorist acts
being done in the name of Islam, and call a spade a spade.
There's a site with lots of very interesting information on trusted computing
(or as the FSF calls it, treacherous computing).
This is a controversial approach that transfers control of your computer
from owners to vendors.
There's an interesting project to create
a video card especially for OSS/FS systems.
An interview with its developers has more information.
You can also see the project's own pages.
More information can be found on the mailing list front page and
at the bottom of the interview.
More recently there has been further news about it.
There's an interesting (and I think fundamentally correct)
analysis of a major trend in computing:
software will increasingly have to be designed to be concurrent,
because that will be the only way to make full use of the power of
the next generation's computers.
Various libraries can help with writing concurrent software, including glib.
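As a tiny sketch of the kind of design shift this implies, here is a minimal Python example that splits a CPU-bound job across processor cores instead of relying on a single fast core (the prime-counting workload is just a stand-in):

# Minimal sketch of designing for concurrency: split CPU-bound work
# across cores rather than waiting for a faster single core.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division (deliberately CPU-bound)."""
    lo, hi = bounds
    def is_prime(n):
        if n < 2:
            return False
        i = 2
        while i * i <= n:
            if n % i == 0:
                return False
            i += 1
        return True
    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    limit, workers = 200_000, 4
    step = limit // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    # Each chunk runs in its own process, so all available cores get used.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        total = sum(pool.map(count_primes, chunks))
    print(f"primes below {limit}: {total}")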
There are various info sources for getting a laptop to run Linux.
If you don't want to do the fiddling yourself, several organizations
will sell you a laptop with Linux pre-installed; several vendors
sell such things.
There are also good sites for
general news, information, tips, and so on.
One website
links to lots of information and tips on getting Linux systems to run well
on various specific laptops (useful if there's some trick to installation,
or a caveat on something that doesn't work well);
before you buy a laptop, look there for information about that
particular model.
An extremely important development is the OpenDocument format,
which is now an approved OASIS committee draft.
Wouldn't it be wonderful if you could actually access the office
documents that you've created, without their format being controlled
by any single vendor, so that you could pick the best product for
your needs?
A good article (January 30, 2005)
explains why OpenDocument is important.
Programs that support it, or plan to, include
StarOffice and OpenOffice.org, KOffice,
SoftMaker Software GmbH's TextMaker
(whose developers asked "is anyone using Microsoft Office XML for anything?"),
AbiWord, and
IBM's Workplace Client Technology.
OpenOffice.org and StarOffice include import/export filters for
Microsoft's .doc and Corel's Word Perfect formats, so they can
be used to transition between them.
One commenter notes that the Apache Forrest project has plugins
for using OpenOffice documents as input to the Forrest processing pipelines.
He also reports that OpenDocument's main competitor,
Microsoft's "Office XML", is a terrible format:
"the Burrokeet project [found that it's] easier to use OOo as
a headless server that can convert MS Office to OOo format
and then work with the OOo XML files...".
In contrast, Ferdinand Soethe said,
"I'm impressed how easy it is to work with oo-xml".
Although they started to work with MS XML, it appears they plan to
deprecate that work, and switch entirely to using just OpenDocument.
; "the structure of what you get is amazingly twisted,
and it's painfully obvious that WordprocessingML
(formerly the catchier WordML) is a serialization of
internal structures in Word, not an XML vocabulary designed by
people who actually care about working with XML."
Gardler agreed wholeheartedly.
One commentator has an interesting way of putting it.
There's also some information available about how OpenOffice.org's
spreadsheet and Excel store their data;
of particular interest is OO.o's own file-format documentation.
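One nice property of OpenDocument is that you can look inside a file yourself: an .ods or .odt file is just a ZIP archive of XML, with the document body in content.xml. Here is a minimal sketch (the filename example.ods is only a placeholder):

# An OpenDocument file (.ods, .odt, ...) is a ZIP archive of XML files;
# the document body lives in content.xml.  Minimal peek inside one.
import zipfile
import xml.etree.ElementTree as ET

with zipfile.ZipFile("example.ods") as z:
    print("archive members:")
    for name in z.namelist():
        print(" ", name)                  # content.xml, styles.xml, meta.xml, ...
    root = ET.fromstring(z.read("content.xml"))
    # Count spreadsheet cells as a quick check that we parsed real content.
    cells = [e for e in root.iter() if e.tag.endswith("}table-cell")]
    print("table cells found:", len(cells))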
OpenOffice.org is already much better than Microsoft Word for large
academic works, for at least two reasons.
First, it has a working master document capability.
Second, it has much better support for complicated bibliographies
and references (it uses a database for the references, so
you can auto-generate the bibliography with the right format).
This work has been the result of a dedicated project, which
is working on still further improvements.
The ISO process that accepted the Microsoft XML format (aka OOXML or ECMA-376)
was a nasty sham.
ISO seems to think it now has some control over the specification in order
to fix its problems.
For more technical information on file formats,
there are sites that maintain collections of technical information.
There's also a nice overview of the issues raised by non-Latin DNS names.
I still think letter coloring is helpful for the LARGE number of
people who NEVER use non-Latin DNS names, but it's a good summary
of the alternatives.
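As a tiny illustration of the non-coloring alternatives, here is a sketch of the simplest client-side check: flag any hostname label that is Punycode-encoded or that mixes ASCII with non-ASCII characters (real checkers consult per-script Unicode tables, so treat this as a toy):

# Crude sketch of a homograph check for internationalized domain names.
def suspicious_labels(hostname):
    findings = []
    for label in hostname.split("."):
        if label.startswith("xn--"):
            findings.append((label, "Punycode-encoded label"))
            continue
        has_ascii = any(ord(c) < 128 for c in label)
        has_non_ascii = any(ord(c) >= 128 for c in label)
        if has_ascii and has_non_ascii:
            findings.append((label, "mixes ASCII and non-ASCII characters"))
    return findings

if __name__ == "__main__":
    # The last example uses a Cyrillic "а" in place of the Latin "a".
    for host in ["www.example.com", "xn--something.example", "pаypal.com"]:
        print(host, suspicious_labels(host) or "nothing flagged")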
If you own an Internet domain,
there are services that do a nice
job of running some automated checks for common DNS problems
(I haven't had a chance to see them all).
Finally, here is one more interesting, nifty URL.
