31.7.13
An Empty Conversation on Emptiness
Boss Huang is perceptive, young and ambitious, every bit the image of a future pillar of the nation. Lately, while waiting to switch tracks, he has had both the leisure and the spare time, and has grown ever more active in his views and thinking, dabbling in psychology here and the Shurangama Sutra there, and seems to have gained some insight. Below is a recent conversation with Boss Huang.
Boss Huang: Do you have any friends who do brain research? I want to know about consciousness.
Me: No. But brain research has been extremely hot in recent years.
Boss Huang: Really?
Me: Lots of books. I have been keeping an eye on it.
Boss Huang: It's just that over the past half year, on the path of studying Buddhism, I have come to understand a few things, so I want to learn more.
Me: Someone at USM Kelantan landed a huge research grant for brain research. I would encourage you to look in that direction.
Boss Huang: I think if I do, it would be towards psychology.
Me: No, I mean brain science, not psychology.
Boss Huang: Not interested. Things that are too scientific feel limited to me.
Me: Modern psychology all starts from Freud's theories, which have drawn some controversy in recent years.
Boss Huang: I would like to start from the Buddha's teachings. May I? Haha...
Me: That is why I think psychology is limited too.
Boss Huang: As long as it is not logical, it becomes unlimited.
Me: You want to look into the human mind?
Boss Huang: Sort of. After all these years of studying Buddhism, I kept hearing people speak of emptiness. Now I have truly seen emptiness.
Me: Go and practise Chan meditation first.
Boss Huang: Right. Before practising, I kept thinking with logic, and I was stuck there, never able to see it.
Me: I just don't know whether the emptiness you saw is an illusion of your own views.
Boss Huang: I don't think it is.
Me: Don't speak of emptiness so lightly.
Boss Huang: I say so because I discussed it with two others. What they saw resembles what I saw, but it simply cannot be put clearly into words.
Me: Perhaps it is best you don't label it as empty or real.
Boss Huang: This emptiness cannot be labelled, because it transcends everything. To compare it in terms of empty versus real is already wrong. It lies in between: it is neither empty nor real, and yet you could also say it is both empty and real.
Me: When a Chan practitioner describes "emptiness" the way you just did, I am afraid he has matched himself to the wrong label.
Boss Huang: Right and wrong are also within emptiness.
Me: Just practise Chan honestly. Cooking up your own theories and fitting yourself into them, without even being able to tell whether you are right or wrong, is very dangerous.
Boss Huang: Haha... but I haven't made up any theories.
Me: Better not to.
Boss Huang: Last December, by some chance, I went and read the Shurangama Sutra, trying to find the so-called mind and the First Cause. That was why, in all my years of studying Buddhism, I had refused to take refuge: no one could explain the First Cause to me. And in the end I saw the so-called mind.
Me: A real insight, then. Can you come up with the winning numbers yet?
Boss Huang: Winning numbers?
Me: You know, the numbers from Ang Kong.
Boss Huang: What is that?
Me: Magnum, Da Ma Cai, that sort of thing.
Boss Huang: What has that got to do with anything?
Me: You really do seem enlightened.
Boss Huang: Enlightened about what? Even if I were enlightened, it would have nothing to do with winning numbers.
Me: Aiya, how can you be so slow that you cannot even 'realize' my joke.
Boss Huang: I was slow on purpose. But back to the point: I really do want to know about the relation between the brain and consciousness.
Me: Then that is not psychology. That is the domain of neuroscience.
Boss Huang: I don't know how you see this thing called the "mind", but I myself have found that it is not inside the brain, so psychology would be closer to it.
Me: What I am saying is that once you practise Chan honestly, you will come to know the human mind.
Boss Huang: The human mind does not exist in the first place.
Me: Psychology only looks at the worldly level. Whether the mind exists is merely a philosophical proposition, and has nothing to do with the mind itself.
Boss Huang: But people often do many foolish things for the sake of the heart, and then they are happy, sad, greedy, heartless and so on.
Me: Those are just phenomena.
Boss Huang: Right. But many people don't understand that, which is why so many live in suffering.
Me: Don't look outward. Once you have seen what is within yourself, you will understand everything else.
Boss Huang: Can you explain that sentence?
Me: Once you have seen what is within yourself, you will understand everything else.
Boss Huang: Can you put it more clearly?
Me: I have no energy left to explain. Work it out yourself...
30.7.13
A Conversation on Instantaneous Action in Physics
Boss Huang: Is anyone doing research on teleportation now? Can you explain their theory to me?
Me: Yes. There is a well-known Malaysian physicist at ANU, the Australian National University, Ping Koy Lam. Basically it traces back to an early paper by Einstein known as the EPR paradox, which involves a phenomenon called quantum entanglement.
Because of quantum entanglement, information seems to be conveyed instantaneously between two widely separated points in space, which would appear to violate special relativity. I don't know whether you followed that.
Boss Huang: But put that way, doesn't there have to be a pair?
Me: Of course. How would a single particle define two points in space?
Boss Huang: Then how do they use it for teleportation? To put my question more simply: how do you teleport an apple from one point to another?
Me: First you prepare an EPR pair of particles. You encode the information into the quantum state of one of them, and the other immediately acquires the corresponding quantum state. What you teleport is not matter but the particle's quantum state.
Boss Huang: So that means we still cannot teleport matter?
Me: You teleport the information of the quantum state to another point in space, and once the information is received, you reassemble it into the particles there.
Boss Huang: So it is like cloning? Is it?
Me: So these two 'apples' are not the same apple, but from the quantum point of view they are equivalent, because they carry the same quantum state. You are sharp; you understood what I said.
Boss Huang: Haha, thanks. Then let me ask: when matter is reduced to its simplest, is it information? Or is there still some matter left?
Me: From a certain angle, information is more fundamental than matter.
Boss Huang: Then can information disobey the laws of physics?
Me: From the physics point of view, matter is fields. There are fundamental fields, but these fields interact with one another, transforming from one kind into another. So some say there are no truly fundamental fields, only the collective effects produced by the interactions; this is the emergence viewpoint. My own view is that whether fundamental fields exist is just a matter of viewpoint and definition.
Boss Huang: Hmm. But you still haven't answered my question. Can information disobey the laws of physics?
Me: Information is not a material field, yet it does obey certain laws, and those laws differ from the laws of material fields, just as lightning and a stone obey different equations. The physics of information is very abstract and I do not know much about it; it is closely tied to thermodynamics. One claim is that information is conserved, and that is its biggest physical constraint. So information must still obey physical laws, only not quite the same laws as material fields.
Boss Huang: There is a natural phenomenon happening all around us that I only recently began to think about, and I cannot figure it out. Copying information does not require raw material, does it? For instance, if I have an apple and give it to you, I lose the apple.
But if I have a piece of information and give it to you, I still have it. What is the relationship between information and matter?
Me: No, information is not matter. Like data in a computer, it needs a carrier in order to exist, but it is not the carrier itself. Even space can be regarded as a carrier. Information, like computer data, can be copied while the original remains intact (roughly speaking; once quantum effects are considered, the story changes a little). It is like passing on a flame: after the flame is passed, the original flame is still there. Just do not treat information as matter. Transmitting information from one point to another requires some physical mechanism, but almost every physical mechanism inflicts a certain degree of damage on the information being read and transmitted, so the copied information may be degraded. That is the quantum effect mentioned above.
Boss Huang: OK, understood. So for an apple, matter is its carrier, and all the apple's characteristics are information. Is that right? So what humans have achieved so far is to transfer the information in an apple to another place, and then inject that information into matter to recreate an apple with identical characteristics? Looking back at what you said above, we reassemble the information into particles to obtain an identical apple; so can this step be run in reverse? That is, if we decompose the information of the apple here, will the apple be reduced back to particles?
Me: Strictly speaking, the reconstructed apple is not the same apple as the original, but the two carry the same information. From the standpoint of quantum mechanics, the two are identical. For all practical purposes, whether it is the same apple no longer matters, because the two differ in no practical respect whatsoever. It is like copying an MP3: the sound quality of the two copies is exactly the same. You are clever; you understood what I said.
Boss Huang: Thanks for the compliment. So as long as it is not a living thing, in the future we could send an object directly to another place?
Me: Teleporting living things is a very interesting topic. I do not know much about it; if I said a lot, it would probably be nonsense.
Boss Huang: Hmm. Now that we can turn particles into an apple, can we turn an apple into particles?
Me: "In the future we could send an object directly to another place" is premature. Didn't I say earlier that quantum entanglement is easily (almost unavoidably) disturbed and destroyed by the physical processes involved in teleportation? Then the teleportation cannot proceed. This is a practical problem.
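The teleportation protocol discussed above can be sketched as a small state-vector simulation. This is a textbook-style sketch, not any specific experiment; the amplitudes (a, b), the seed and all names are illustrative. One detail glossed over in the dialogue: the standard protocol also requires Alice to send Bob two classical measurement bits, which travel no faster than light.

```python
import numpy as np

# Single-qubit gates
I = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.diag([1., -1.])
H = np.array([[1., 1.], [1., -1.]]) / np.sqrt(2)

# The unknown state |psi> = a|0> + b|1> that Alice wants to teleport
a, b = 0.6, 0.8
psi = np.array([a, b])

# Qubits 1 and 2 share the Bell pair (|00> + |11>)/sqrt(2)
bell = np.array([1., 0., 0., 1.]) / np.sqrt(2)
state = np.kron(psi, bell)          # 3-qubit register, qubit 0 most significant

# Alice applies CNOT (control 0, target 1), then H on qubit 0
P0, P1 = np.diag([1., 0.]), np.diag([0., 1.])
CNOT01 = np.kron(P0, np.kron(I, I)) + np.kron(P1, np.kron(X, I))
state = np.kron(H, np.eye(4)) @ (CNOT01 @ state)

# Alice measures qubits 0 and 1; sample one of the four outcomes
amps = state.reshape(2, 2, 2)       # axes: qubit 0, qubit 1, qubit 2
probs = np.sum(np.abs(amps) ** 2, axis=2).ravel()
outcome = np.random.default_rng(1).choice(4, p=probs)
m0, m1 = outcome >> 1, outcome & 1

# Bob's qubit collapses; he applies X^m1 then Z^m0 using the two classical bits
bob = amps[m0, m1] / np.sqrt(probs[outcome])
if m1:
    bob = X @ bob
if m0:
    bob = Z @ bob

print(np.allclose(bob, psi))        # True: the state has been teleported
```

Whatever outcome Alice measures, Bob's corrected qubit ends up in the original state (a, b), while Alice's copy is destroyed by her measurement, so nothing is cloned.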
Boss Huang: Is there any way to make something in this world disobey the laws of physics?
Me: Matter, by definition, must obey the laws of physics. Unless it is not matter.
Boss Huang: Is there an example of something that is not matter?
Me: Consciousness. Every physical existence we know of must obey physical laws; I can hardly think of another example. Everything in the universe is governed by the laws of physics, but for consciousness we cannot be sure.
Boss Huang: Does consciousness include all feelings?
Me: Yes.
Boss Huang: Have you read a story called The Ghost in the Shell? It is set in a future where humans can transfer consciousness into other so-called objects. The idea in the film The Matrix was also taken from that story.
Me: If it is consciousness, then perhaps it can escape the laws of physics.
(to be continued)
29.7.13
Our tools for computational materials simulation
For the purpose of performing computational materials simulations (CMS), one essential factor is the availability of highly specialised computational tools. Not only do we need these tools; equally important is the know-how to use them. By 'tools' I mean both hardware and software.
The hardware resources we have in the physics school are 'moderately abundant'. We have PC clusters and workstations, totalling around 256 cores. The main purpose of this hardware is to provide processor cores for intensive computation. For good performance, these computers also have to be equipped with a generous amount of RAM. In the long term, when the group expands, the available hardware resources may become very tight, but at the moment we are quite comfortable with what we have.
Software tools can further be broken down into two levels of complexity: (1) running the software, and (2) post-processing the data churned out by it.
The software tools available to us include ABINIT, Wien2k, DeMon, LAMMPS, DFTB+ and HOTBIT. We also have the licensed software VASP and Crystal, but we are not as skilled with these as with the former. We also have experience with Gaussian and Materials Studio, but unfortunately we do not have full access to licensed versions of them. All the software mentioned above (except VASP and Crystal) is fully installed on our hardware. Installing and running VASP and Crystal are not a priority for the moment, as we already have more software than we can digest.
There are basically three major categories of software for CMS: DFT codes (ABINIT, Gaussian, VASP, Crystal, DeMon), tight-binding DFT codes (DFTB+, HOTBIT), and molecular dynamics codes (LAMMPS). We have installed these on our computational resources (most are set up to run in parallel). Most importantly, our group members have gained enough experience to use them (except VASP and Crystal).
However, when running a CMS calculation we very often need to know the ground-state (GS) structures at zero temperature, i.e., the configurations with the lowest energy. If a system is to be stable (and hence correspond to a state observable in the lab), it should be in the GS or in a metastable state (which is less stable, and subject to transitions to other nearby metastable states under thermal perturbation). Finding the GS structure is itself a difficult task, since there are in principle almost infinitely many ways a system can be configured in a finite 3D volume. To this end we need intelligent methods to help us find the GS, such as genetic algorithms (GA) and basin hopping (BH). Most often, one has to couple GA and/or BH to the software above to obtain the GS of a system. BH and GA programs for locating the GS are usually standalone codes developed by individual research groups. However, GASP, a GA program recently developed by the Cornell materials engineering group, has been interfaced with LAMMPS. Although GASP+LAMMPS has been configured to run on our compute nodes, we have yet to develop the know-how to use it for GS finding, because we have not had time to explore the GASP code and tune the GA parameters.
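As a rough illustration of the GA idea (this is not GASP; the 1-D 'energy landscape', population size, mutation scale and generation count below are made-up stand-ins for a real structure search), a minimal sketch looks like this: select the fittest candidates, breed them by crossover, and perturb the offspring by mutation.

```python
import numpy as np

def energy(x):
    # Toy 1-D energy landscape standing in for a cluster's total-energy surface
    return np.sin(3 * x) + 0.1 * (x - 1.0) ** 2

rng = np.random.default_rng(7)
pop = rng.uniform(-5, 5, size=20)              # initial population of "structures"

for gen in range(60):
    fitness = -energy(pop)                     # lower energy = fitter
    # Selection: keep the better half (elitism preserves the best candidate)
    parents = pop[np.argsort(fitness)[::-1][:10]]
    # Crossover: average random pairs of parents
    pairs = rng.choice(parents, size=(10, 2))
    children = pairs.mean(axis=1)
    # Mutation: small random perturbation of the offspring
    children += rng.normal(scale=0.3, size=10)
    pop = np.concatenate([parents, children])

best = pop[np.argmin(energy(pop))]
print(best, energy(best))
```

In a real structure search each "individual" would be a full set of atomic coordinates relaxed by LAMMPS or DFTB+, and crossover would splice structural fragments rather than average numbers, but the select/cross/mutate loop is the same.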
A GA (or BH) coupled to LAMMPS or DFTB+ allows one to search for the GS at zero temperature. Without GA or BH (or some other global-minimisation algorithm), one cannot confidently claim that a given configuration of atoms is a stable state: a total-energy calculation for an arbitrary configuration at zero temperature cannot convincingly be claimed to have found the global minimum. A workable GS-searching capability (coupled to the CMS software) is what we have been trying to acquire for some time. The good news is that we have just obtained such an "upgrade". Well, we did not do it ourselves; all credit goes to Prof. Lai's group in the NCU complex fluids research lab, who have developed a highly effective parallelised GA code that can easily be coupled to LAMMPS and DFTB+. On top of this, they also have a BH code that can be used in place of the GA as a standalone optimisation tool. We simply borrow what they developed with blood and tears. The next step is to couple the GA and BH to DFT software. Technically such a coupling is possible, but we anticipate a serious speed penalty due to the expensive cost of DFT calculations.
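A minimal basin-hopping sketch, again on a toy 1-D landscape (this is not Prof. Lai's code; the fictitious temperature, hop size and the crude steepest-descent relaxer are illustrative assumptions): hop randomly, relax to the nearest local minimum, then accept or reject the move with a Metropolis rule.

```python
import numpy as np

def energy(x):
    # Toy 1-D energy landscape standing in for a cluster's total-energy surface
    return np.sin(3 * x) + 0.1 * (x - 1.0) ** 2

def relax(x, lr=0.01, steps=400):
    # Crude steepest descent to the nearest local minimum (numerical gradient);
    # in practice this role is played by the LAMMPS/DFTB+ structural relaxation
    for _ in range(steps):
        grad = (energy(x + 1e-5) - energy(x - 1e-5)) / 2e-5
        x -= lr * grad
    return x

rng = np.random.default_rng(42)
kT = 0.5                                       # fictitious Metropolis temperature
x = relax(4.0)                                 # arbitrary starting configuration
e = energy(x)
best_x, best_e = x, e

for _ in range(100):                           # the basin-hopping loop
    x_new = relax(x + rng.normal(scale=1.0))   # random hop + local relaxation
    e_new = energy(x_new)
    if e_new < e or rng.random() < np.exp((e - e_new) / kT):
        x, e = x_new, e_new                    # Metropolis accept
        if e < best_e:
            best_x, best_e = x, e

print(best_x, best_e)   # with enough hops this settles near x ~ 1.56, E ~ -0.97
```

The Metropolis step is what lets the walker climb out of a shallow basin instead of getting trapped, which is exactly what a plain relaxation cannot do.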
In most cases, the output of CMS calculations (be it DFT, DFTB+ or LAMMPS) has to be post-processed, and usually this has to be done manually. The software packages provide very limited post-processing functionality beyond some very common tasks. For example, plotting the electronic density from a DFT calculation is routine, and can easily be done with the post-processing packages that come with, say, Gaussian's GaussView. For MD and DFTB+, however, almost no such post-processing tools exist; we have to write our own code. For example, we wrote our own Mathematica code to calculate the average bond length of selected atomic species in LAMMPS output data. To determine the point group of a cluster of boron atoms in typical DFTB+ output, we have to do it manually with an independent program that calculates the point group from the atomic coordinates.
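As a flavour of this kind of post-processing, here is a Python analogue of such a bond-length script; the four-atom 'snapshot' and the 1.8 Å cutoff are hypothetical stand-ins for a parsed LAMMPS dump.

```python
import numpy as np

# Hypothetical stand-in for a parsed LAMMPS dump snapshot:
# columns are (type, x, y, z); types 1 and 2 are the selected species.
atoms = np.array([
    [1, 0.0, 0.0, 0.0],
    [2, 1.5, 0.0, 0.0],
    [1, 3.0, 0.0, 0.0],
    [2, 1.5, 1.4, 0.0],
])
types, coords = atoms[:, 0].astype(int), atoms[:, 1:]

cutoff = 1.8   # Angstrom; pairs closer than this count as bonded (assumed value)
bonds = []
for i in range(len(atoms)):
    for j in range(i + 1, len(atoms)):
        # only count bonds between the two selected species
        if {types[i], types[j]} == {1, 2}:
            d = np.linalg.norm(coords[i] - coords[j])
            if d < cutoff:
                bonds.append(d)

print(np.mean(bonds))   # 1.5 for this toy snapshot
```

A production version would loop over timesteps, apply the periodic minimum-image convention, and read the type/coordinate columns straight from the dump file, but the core distance-filter-average logic is the same.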
Of course, there are many potential technical issues that could keep us from completing an intended calculation. The most serious hindrances are the unavailability or unsuitability of force fields (in the case of LAMMPS) or SK files (in the case of DFTB+). DFT calculations have no force-field or SK-file issues, but they suffer instead from non-convergence, or simply from taking too many hours to complete.
The point I wish to highlight is that, after an extended period of hard work, we have gained a great deal of experience in installing, maintaining, using and manipulating the specialised software mentioned above. This technical know-how is by no means easy to come by; it has to be earned through many hours of learning and effort. Once we have identified a problem to attack, we are now in a much better position to proceed confidently. A newcomer who wants to perform CMS will have to master all these software tools, apart from knowing the physics and the theory. Even if she/he does not become a great CMS scientist, she/he will at least be a well-trained computer expert in Linux, post-processing, programming and visualisation.
After the calculations are complete, we face another kind of issue: the reliability of the results, and their physical interpretation. The former is usually easier than the latter. We need to know a lot of physics to write even one meaningful sentence of physical interpretation; this is the part we have to keep improving on and learning from Prof. Lai. In comparison, running a calculation is relatively easy; assuring its correctness, and interpreting the physics of the results, is harder.