# HG changeset patch
# User HackBot
# Date 1357248087 0
# Node ID 5b377dc03f48dd2a788f472a777ac285f166b3f2
# Parent 01aa09301acf0d8614f60c09a14c991025287ebf
pastelogs JIT

diff -r 01aa09301acf -r 5b377dc03f48 paste/paste.28983
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/paste/paste.28983	Thu Jan 03 21:21:27 2013 +0000
@@ -0,0 +1,301 @@
+2004-05-27.txt:21:29:16: and a JIT style intepreter would have to do a lot of processing to deal with self-modification
+2004-09-13.txt:11:30:57: the "jit-compiling" one?
+2005-05-07.txt:11:44:08: If you just want to run an interpreter, you could do it with binfmt_misc. JIT-compilation of brainf*ck code to native, in the kernel, would rule, though. :p
+2005-06-09.txt:09:47:30: cool.. my Java install was broken.. even http://kidsquid.com/EsoShell works in IE6 now.. boy is Sun Java slow compared to the Microsoft JIT.. took about 10 minutes to load the applet in qemu
+2005-07-24.txt:20:06:04: <{^Raven^}> what about a JIT interpreter?
+2005-07-24.txt:20:10:36: dynamic recompiling? jit compiling?
+2005-07-24.txt:20:10:45: vm's often jit compile
+2005-07-24.txt:20:11:30: dynamic recompiling is a special form of jit compiling
+2005-07-24.txt:22:56:32: that's like jit compiling
+2005-07-24.txt:22:57:46: if you compile all loops as soon as they get executed it's real jit compiling if you always translate the next n instructions it's dynamic recompiling
+2005-07-25.txt:20:56:28: GregorR: but you could call it jiting interpreter...
+2005-07-25.txt:20:56:28: GregorR: a jit-compiler compiles a routine before it gets executed.. bf code has no subroutines so you may call it jit-compiling ;)
+2005-07-27.txt:23:14:34: creates egobfc2m binaries? or does it "jit" compiling?
+2005-07-27.txt:23:14:48: egobfc2m is jit-ish.
+2005-07-28.txt:23:49:05: anyway, GregorR, I was just kidding; I like the idea of having a kind-of-JIT for BF
+2005-07-28.txt:23:53:59: i'd like to combine my optimizations with your jit-a-like interpreter
+2005-09-16.txt:05:33:08: I know they're working on a JIT, but that's hardly the same...
+2005-10-15.txt:14:05:34: and my interpreter uses cool jit compiling technologies (called eval ;) )
+2005-10-19.txt:23:13:55: Mainly the JIT part.
+2005-12-01.txt:16:26:12: have someone ever tried to write JIT VM ?
+2006-01-15.txt:23:58:09: EgoBF also has a compiler and a JIT compiler.
+2006-01-19.txt:23:27:27: It's part of my EgoBF suite, a JIT compiler for BF.
+2006-03-02.txt:16:14:08: qemu basically JITs things
+2006-08-24.txt:22:38:24: It has a JIT compiler.
+2006-08-26.txt:22:18:51: You idjit. . .
+2006-08-27.txt:20:23:29: JIT = teh rawx
+2006-10-23.txt:21:33:41: OH, that's not a backronym you idjit >_<
+2006-10-25.txt:02:54:37: it's sad when a _webpage_ will jitter and sputter on my computer
+2006-12-31.txt:03:22:36: The kernel should have interpreters (perhaps JIT) for Glass, BF, etc)
+2006-12-31.txt:20:47:33: Of course it's written in C ya' bloody idjits :P
+2007-06-22.txt:23:08:54: Nah. Lisp is JIT compiled.
+2007-06-22.txt:23:09:37: JIT is effectively the same as interpreted from a workflow perspective- it's just an abstraction layer that makes things more zippy
+2007-07-11.txt:01:30:36: nice, JIT compilation provided by oklopol :D
+2007-09-14.txt:18:46:02: <_D6Gregor1RFeZi> http://images.google.com/images?q=fujitsu+stylistic+1200
+2007-10-14.txt:00:34:10: I'd like to imagine something that could be implemented in a JIT.
+2007-10-27.txt:23:04:28: python is slow, lua's "end"s are ugly (but it is very fast, more so with luajit)...
+2007-10-29.txt:20:39:08: hmz, maybe you should try jitting the bf
+2007-11-26.txt:20:42:11: so when i do, i'm all nervous and jittery for the rest of the day
+2008-03-07.txt:16:21:41: possibly you could do some JIT optimization, like say, pre-compile paths or something
+2008-03-07.txt:22:19:15: mmmm, jit
+2008-03-07.txt:22:19:25: "jit"?
+2008-03-07.txt:22:19:33: oops, JIT
+2008-03-11.txt:19:24:15: Deewiant, well JITs does work around it
+2008-03-13.txt:18:32:20: oklopol, JIT should be possible
+2008-03-13.txt:18:34:38: jit
+2008-03-13.txt:18:36:07: that's JIT
+2008-03-13.txt:18:38:33: with befunge you can do either threaded code or JIT
+2008-03-14.txt:00:13:48: or jit
+2008-03-14.txt:00:19:31: lament: when the code is modified, it's trivial to fix the stack+goto based form, compilation to python can be done jit then.
+2008-03-14.txt:02:44:01: oklo, so JIT I assume?
+2008-03-14.txt:03:13:27: - make a befunge compiler without an existing program that needs jit compilation
+2008-03-14.txt:03:14:17: - make a befunge compiler without an existing program that needs jit compilation
+2008-03-14.txt:03:19:25: oklo, but one without JIT?
+2008-03-14.txt:03:20:03: AnMaster: well, i don't even have the possibility of jit in my current one
+2008-03-14.txt:03:20:04: oklo, basically it is impossible to design a befunge compiler that does NOT need jit
+2008-03-14.txt:03:20:15: sure for most common cases you can do reparsing/ijit
+2008-03-14.txt:03:20:17: jit*
+2008-03-16.txt:14:59:34: so a JIT wouldn't help much
+2008-03-16.txt:18:39:23: Deewiant, btw I've been thinking of some kind of JIT for some things, but I think they may end up slower, ie. caching the matching ; for a ; and such
+2008-03-16.txt:18:46:57: something that can be JIT compiled or so
+2008-03-17.txt:22:20:02: you could jit stuff
+2008-03-17.txt:22:20:45: if a region is executed a lot with no change, JIT it, but look out for changes to an "dependents" the region has
+2008-03-17.txt:22:23:23: SimonRC, JIT would mean compile to machine code?
+2008-03-17.txt:22:24:34: AnMaster: so how to JVMs do JITting?
+2008-03-17.txt:22:26:34: wait, you want me to do something 1000 JMV and .NET engineers can't and produce a protable JITter?
+2008-03-17.txt:22:27:38: ais523, right, but how would that work for JIT?
+2008-03-17.txt:22:28:12: hmm... for JIT you'd probably have to precompile some example asm
+2008-03-17.txt:22:32:20: anyway it doesn't make sense for a JIT as far as I can see
+2008-04-03.txt:11:09:41: and JIT
+2008-04-04.txt:20:59:34: <3 optimistic JIT compilation
+2008-04-04.txt:21:04:26: GregorR: Did you know that $YOUR_FAVORITE_DYNAMIC_LANGUAGE probably doesn't JIT as fast as HotSpot can?
+2008-04-04.txt:21:04:59: $MY_FAVORITE_DYNAMIC_LANGUAGE doesn't have a JIT as-is :P
+2008-04-14.txt:19:34:31: JITted javascript?
+2008-04-14.txt:19:36:06: mmm, partially-precompiled JITted Javascript with big fat libraries
+2008-06-16.txt:09:39:38: I wrote a JIT MIPS->JavaScript compiler (in JavaScript) ... does that make me a bad person? D-8
+2008-06-16.txt:16:01:20: 01:39:38 I wrote a JIT MIPS->JavaScript compiler (in JavaScript) ... does that make me a bad person? D-8
+2008-06-16.txt:20:30:52: (But hey, now it has a JIT and read()!)
+2008-06-17.txt:16:06:44: it even JITs it
+2008-06-27.txt:16:30:38: ais523, what about JITing?
+2008-06-27.txt:16:30:48: * AnMaster has pondered JIT of befunge for quite some time
+2008-06-27.txt:16:30:58: but JIT is unportable
+2008-06-27.txt:16:37:34: ais523, what about JITing?
+2008-06-27.txt:16:37:34: * AnMaster has pondered JIT of befunge for quite some time
+2008-06-27.txt:16:37:34: but JIT is unportable
+2008-06-27.txt:16:38:04: and I thought JIT was a compilation technique, so how can it be unportable?
+2008-07-05.txt:09:25:37: I think it could be possible to JIT it though
+2008-07-05.txt:09:25:54: JIT?
+2008-07-05.txt:09:26:19: Slereah_, JIT compile it
+2008-07-07.txt:21:55:13: is it JITTING?!
+2008-07-07.txt:21:55:18: AnMaster: worse than jitting
+2008-07-07.txt:21:55:23: worse than jitting?
+2008-08-07.txt:20:12:49: user-space that is, basically just a normal linux kernel, a hardware-specific llvm jit interpreter, and then every binary is just llvm bitcode files
+2008-08-08.txt:20:44:52: tusho, well.. you could probably JIT befunge quite well
+2008-08-08.txt:23:08:49: * pikhq wants to make a JITing brainfuck interpreter now... :p
+2008-08-24.txt:00:04:00: ZP02S1sQL6XvaUSK3J6hGb5WKXjIrC4lVweYsYd6G6yQVzVfVzOThzUxOuP9vOJUqVKLhFcTs1smP0OK4Jb54NhWNVB1pWARt1EBsQo4iuGXZ9A3ICozTNeSsTzYznVygfJEwKuJOVd7xbMJItHYZu1vy7pXp87BEIqRyF3SSVk6Utne2SGVrv36VTeqR7ThBeMP45olUtZOapRSDP9BrkWbRuttjjhaTFK13D0dZXl7hT7LRvZ5Koi2zMRqX6s4ewxxXIToTDb0TlgwM5oEdo7fI3WErmOIYo3423n042IfJT87ecR51HySeCUBWOPc4xCliNSyloM6scQrwMMJxtDxK7THJbmBKgrDJKQecFiI1zMtdtZHNLQf1XfaADXHVKTUwNjSCHEg1bNqAoEc
+2008-09-01.txt:19:18:21: oh btw I was thinking of generating code at runtime, using JIT with LLVM, could be rather interesting
+2008-09-01.txt:19:18:29: I think JITing funge would work
+2008-09-05.txt:19:57:42: tusho, my other idea: Befunge: you could probably JIT it
+2008-09-05.txt:19:58:34: tusho, can't find the word JIT there?
+2008-09-05.txt:19:59:07: threaded code != JIT
+2008-09-05.txt:20:00:24: and it is not JIT really for B98
+2008-09-05.txt:20:04:00: tusho, it could use some JIT framework to do it...
+2008-09-08.txt:19:38:42: you'd all be jittering about quickly while nothing happens
+2008-09-17.txt:22:07:30: I think you could JIT it though
+2008-09-22.txt:10:20:45: you could JIT it
+2008-09-22.txt:10:28:18: there is also a JIT for it iirc
+2008-09-22.txt:10:29:31: llvm byte code is more or less platform independent. You can then either interpret it (it JITs it) or you can compile it into machine code and link it to a binary
+2008-09-22.txt:19:01:01: ais523, yes. Except I have seriously considered adding JIT using llvm or similiar to cfunge
+2008-09-23.txt:18:14:48: sort of like JIT debugging but even crazier
+2008-09-26.txt:23:15:20: it's a bit like JIT compilation, just stupider
+2008-10-02.txt:21:40:18: ais523, anyway llvm allows generating native code, or jit byte code
+2008-10-06.txt:20:02:43: ais523: jit
+2008-10-06.txt:20:03:16: (It's not strictly JIT since BF has no functions, so the whole thing is compiled at once, but it's about as close as you can get :P )
+2008-10-11.txt:14:10:22: so no JIT?
+2008-10-12.txt:01:31:12: (If only the translator could translate itself... I don't think it's rpython though. But if the translator can get the JIT to be fast someday, and a fast JITted pypy runs the translator... well then that's pretty amazing)
+2008-10-29.txt:21:47:44: If I really wanted to make fungot fast, I'd run it with some JITting system; it doesn't ever do any self-modification, and really only uses cardinal directions, so static code analysis + constant-folding + JIT complication should make it fearsomely fast.
+2008-10-29.txt:22:56:55: Didn't I just say about using JIT with fungot? :p
+2008-10-29.txt:22:57:20: fizzie: this would be compile-time, not JIT
+2008-10-29.txt:23:01:28: a tracing jit might be easier
+2008-11-03.txt:12:26:08: static typing is basically only an issue when you're jitting the code.
+2008-11-05.txt:20:13:38: fis@eris:~/src/jitfunge$ build/jitfunge test.b98
+2008-11-05.txt:20:14:20: fizzie, JIT to native code?
+2008-11-05.txt:20:27:17: ehird, this jit I mean...
+2008-11-05.txt:20:31:21: it's jit
+2008-11-05.txt:20:32:47: as it is jitting you would probably generate optimised linear code
+2008-11-05.txt:20:41:02: There's a tarball at http://zem.fi/~fis/jitfunge-export.tar.gz but don't expect anything actually working, or any support whatsoever, or even that it'd work on a non-Linux box; I think there's a mremap call in there, I'm not sure how well-supported that is. (Easy to eliminate, though.)
+2008-11-05.txt:20:43:21: src/parser.cc: In constructor 'jitfunge::Trace_impl::opargs::opargs()':
+2008-11-05.txt:20:52:38: jitfunge jittool
+2008-11-05.txt:20:52:45: Incidentally, the scons script should generate both jitfunge and jittool; the 'jitfunge' part is the interpreter (such at it is) but the jittool is maybe more interesting.
+2008-11-05.txt:20:53:10: The jittool one takes file (plus optional four integers, x y dx dy) and generates + compiles a single trace, then dumps the generated code.
+2008-11-05.txt:20:54:52: #3 0x08054749 in jitfunge::AsmFunction::operator() (this=0x841b058) at src/codegen.hh:274
+2008-11-05.txt:21:47:49: fis@eris:~/src/jitfunge$ cat test.b98
+2008-11-05.txt:21:47:49: fis@eris:~/src/jitfunge$ build/jitfunge test.b98
+2008-11-05.txt:21:51:45: ... Funging JIT?
+2008-11-05.txt:21:51:52: pikhq: yes, befunge-98 jit
+2008-11-05.txt:21:52:17: It's not like jitfunge really does anything very -98y yet, though.
+2008-11-05.txt:21:57:29: and presumably jitfunge
+2008-11-05.txt:22:05:12: /home/arvid/local/llvm/bin/llvm-g++ -o build/jitfunge -m32 -ggdb build/main.o build/codegen.o build/interp.o build/parser.o build/space.o
+2008-11-05.txt:22:22:14: src/parser.cc:63: warning: missing initializer for member 'jitfunge::StackChange::flush'
+2008-11-05.txt:22:22:15: src/parser.cc:63: warning: missing initializer for member 'jitfunge::StackChange::in'
+2008-11-06.txt:10:39:53: but for JIT that may not help a lot
+2008-11-06.txt:13:09:33: src/interp.cc: In constructor 'jitfunge::Stack::Stack()':
+2008-11-06.txt:14:12:34: You can run it as "jitfunge file.b98 -d" to make it dump the traces it generates.
+2008-11-06.txt:14:13:50: oh god jitfunge
+2008-11-06.txt:14:36:03: (build/jitfunge life.bf > life.txt &); sleep 20 ; killall jitfunge ; ls -l life.txt generates around 8.5 megabytes of output, compared to ~2.7 megs from cfunge (32bit, -O3, no fancy flags), here. I'm not sure jitfunge is in a benchmarkable state yet, though.
+2008-11-06.txt:14:42:15: fizzie, well for me both generates around 5 MB, sometimes slightly more for cfunge, sometimes more for jitfunge
+2008-11-06.txt:14:44:49: 5 mb from jitfunge, around 1.8 from cfunge
+2008-11-06.txt:14:45:10: fizzie, however I don't know how large setup time jitfunge needs. cfunge have quite a bit of setup time
+2008-11-06.txt:14:50:36: There's quite a lot of small code snippets generated for life.bf, increasing the overhead there. Every time there's a branch or a merging of two code paths, jitfunge splits the code to separate functions there.
+2008-11-06.txt:14:51:31: you try to jit compile befunge?
+2008-11-06.txt:14:52:26: "you to jit compile befunge"?
+2008-11-06.txt:15:00:19: Given that it's JIT, I should probably be collecting some statistics and doing branch prediction that way.
+2008-11-06.txt:15:14:04: fizzie, if you discard some jitted code, and the jitted code is placed in a mmaped region, how do you allocate new ones, try to find the first hole large enough?
+2008-11-06.txt:19:01:26: Meh, sensiblized the code generated by jitfunge, and managed to cut life.bf performance into 4 % (variant 1) or 30 % (variant 2) of what it used to be; the new system does generate longer pieces of code, but it ends up recompiling something all the time. I need a figure out a less complicated test case for the issue, though.
+2008-11-06.txt:20:25:34: The jitfunge stack grows up, but that's pretty arbitrary.
+2008-11-06.txt:20:41:03: This week I've been mostly writing jitfunge. :p
+2008-11-07.txt:06:48:35: Oh, wow... added -O3 to the build arguments in jitfunge just to see if the compiler affects the speed at all. life.bf speed jumped from 10 megs / 20 seconds -> 37 megs / 20 seconds. Even though it doesn't really change the generated code at all. Should profile the beast a bit, I guess.
+2008-11-07.txt:13:30:25: fis@eris:~/src/jitfunge$ cat test.b98
+2008-11-07.txt:13:30:25: fis@eris:~/src/jitfunge$ build/jitfunge test.b98
+2008-11-08.txt:00:51:18: I don't think I'm in any condition to work on jitfunge right now.
+2008-11-09.txt:17:18:18: fizzie, question: does jitfunge handle x?
+2008-11-09.txt:19:19:15: jit
+2008-11-09.txt:19:19:52: * SimonRC doesn't know anything about jitting
+2008-11-09.txt:19:34:27: ("function" here meaning I JIT-compile things into callable functions; of course there's no functions in the Befunge code.)
+2008-11-09.txt:21:29:14: I sort of do a "stack-language to registery machine" thing with jitfunge, given that the input is Befunge and the output is x86 code.
+2008-11-10.txt:11:52:27: Incidentally, jitfunge seems to be able to run underload.b98 now; still woefully incomplete, and don't really have free time to work on it, though.
+2008-11-10.txt:15:11:39: fizzie, any updates to jitfunge?
+2008-11-10.txt:17:25:35: ais523: it's a befunge-98 JIT
+2008-11-11.txt:14:12:01: Tried to compile jitfunge on OS X, but I get a "no return statement in function returning non-void" error (-Werror) from the C++ system header . Not nice.
+2008-11-12.txt:17:53:40: I should fix jitfunge's mycology regression.
+2008-11-12.txt:17:53:56: how far through mycology does jitfunge get?
+2008-11-12.txt:18:03:28: It's still already 384 lines of output with jitfunge's "-d" flag; I'd like something that generates only one or two compiled sort-of-functions so I can just disassemble them and see what goes wrong.
+2008-11-12.txt:18:48:01: jitfunge is, I think, C, if that's what you meant
+2008-11-12.txt:19:18:25: Hah, that was a funny jitfunge bug; when adding the 'IF' operation it didn't clear the constant-folding-stack, so something like 1#$_ would add "push 1" to the op list, then "if", then see the $ (since it branch-predicts true always) and remove the "push 1" because it thought it was discarding a constant.
+2008-11-12.txt:19:34:01: Substituting the -O3 flag with -ggdb did nothing, but it's funny how I get different output from "build/jitfunge mycology.b98" than "build/jitfunge mycology.b98 | cat".
+2008-11-12.txt:19:57:34: Somehow I'm not too surprised jitfunge doesn't play nice with valgrind.
+2008-11-15.txt:21:09:57: GregorR: apparently you found a way to break Firefox 3.1's JIT, well done :D
+2008-11-17.txt:21:21:44: why not JIT it into pure perl
+2008-11-17.txt:21:24:23: I'm not entirely sure JITting it into pure Perl makes a whole lot of sense
+2008-11-17.txt:21:26:08: then JIT the perl into C which you JIT into native code?
+2008-11-17.txt:21:26:28: does Perl JIT into C?
+2008-11-18.txt:19:15:05: I do that in jitfunge.
+2008-11-18.txt:19:17:21: I have a very preliminary tarball of jitfunge in the web, but that's really not a pleasant thing to read.
+2008-11-18.txt:19:17:22: fizzie: doesn't jitfunge do something like that?
+2008-11-18.txt:19:19:03: http://zem.fi/~fis/jitfunge-export.tar.gz if I recall the URL correctly.
+2008-11-18.txt:19:23:14: It's there in the "AsmFunction" class if you want to look at how jitfunge does it.
+2008-11-23.txt:22:29:52: fizzie, how goes jitfunge?
+2008-11-23.txt:22:38:06: AnMaster: Well, feel free to, but that's what I've been doing instead of jitfunge.
+2008-11-29.txt:00:24:54: nooga: It's slowish ... but I have a JIT >: )
+2008-11-29.txt:06:58:22: * GregorR just improved the JIT ... it's a bit faster now.
+2008-12-03.txt:14:36:15: fizzie, any progress on jitfunge?
+2008-12-28.txt:17:21:12: as far as I know I currently beat all except jitfunge
+2008-12-28.txt:17:21:24: and last I heard jitfunge wasn't complete
+2008-12-30.txt:20:50:05: fizzie, there? Any progress on jitfunge?
+2009-01-04.txt:19:34:54: ais523, sure, what about proving a jit compiler!
+2009-01-05.txt:10:16:46: fizzie, progress on jitfunge?
+2009-01-07.txt:09:58:29: something for jitfunge rather
+2009-01-07.txt:20:48:27: ehird, I don't know if jitfunge works on os x
+2009-01-07.txt:20:49:06: I'd hardly expect a jit to be portable :P
+2009-01-07.txt:20:49:42: jitfunge doesn't, at the moment; although it might with some tweaking.
+2009-01-07.txt:20:50:25: We did some speed-benchmarking with something like (build/jitfunge life.bf > life.txt &); sleep 20 ; killall jitfunge ; ls -l life.txt and then comparing the life.txt output size. Silly but... silly.
+2009-01-07.txt:21:03:44: "originally designed and programmed by Alexey Pajitnov in June 1985, while working for the Dorodnicyn Computing Centre of the Academy of Science of the USSR in Moscow."
+2009-01-08.txt:16:51:43: jitfunge uses a fixed-address mmap.
+2009-01-09.txt:17:28:34: if you want to JIT stuff you probably need to use the API
+2009-01-09.txt:17:38:21: Also has a JIT
+2009-01-09.txt:17:38:40: so it uses a custom written JIT for x86 and x86_64 (it says so in README)
+2009-01-09.txt:17:39:16: and the JIT actually works?
+2009-01-09.txt:17:39:40: writing a good JIT takes time, lots of time
+2009-01-09.txt:17:41:26: so he's developed a prototype with a working jit in less than a month.
+2009-01-09.txt:17:42:05: ehird, well I can believe someone managed to write a well working JIT in less than a month if he was dedicated, and a OO lang... But both? No way
+2009-01-09.txt:17:58:18: ehird, writing your own JIT?
+2009-01-11.txt:18:26:54: flexo, iirc fizzie traps segfault in jitfunge
+2009-01-13.txt:18:58:10: ehird, also I admit jitfunge is faster
+2009-01-13.txt:19:03:18: I admit that jit compilers like jitfunge are faster
+2009-01-13.txt:19:27:31: ehird, also if you decide to jit I'm not going to care, I don't have the time to add jitting to cfunge currently, no idea about later
+2009-01-13.txt:19:28:26: it's the jitterbug
+2009-01-14.txt:15:21:50: (and then fizzie went and invented jitfunge, just for even more crazy-speed funge fun)
+2009-01-14.txt:15:45:15: hm, this would be one thing that is easier in a JIT I believe
+2009-01-14.txt:15:45:32: well, JITs need to do that anyway
+2009-01-14.txt:15:46:08: doesn't mean you can't do it in a non-JIT
+2009-01-18.txt:14:28:27: then I think I'll make it a compiler/jit and write it totally in scheme and stuff
+2009-01-18.txt:14:54:23: second version - compiler & JIT
+2009-01-18.txt:14:57:30: ehird, you said compiler/jit?
+2009-01-18.txt:14:58:22: instead of jit
+2009-01-18.txt:14:59:06: ehird, basically: will it jit or be able to compile stand alone binaries?
+2009-01-24.txt:22:15:29: i don't feel like wanting to implement a JIT :)
+2009-01-28.txt:14:23:55: ehird, also I wonder how fast it would run game of life? as fast as jitfunge?
+2009-02-10.txt:15:00:45: i plan to run it using a llvm jit
+2009-02-21.txt:20:42:41: jit
+2009-02-21.txt:23:40:50: but it's more jit-style
+2009-02-26.txt:12:10:54: fizzie, in jitfunge, which way does the stack grow?
+2009-02-26.txt:12:21:18: Actually, heh, I don't think current jitfunge even grows the stack at all. Haven't touched that code in a while, but all I'm seeing here are the underflow checks.
+2009-02-27.txt:16:17:41: ehird: JIT?
+2009-02-27.txt:16:17:52: well JIT is a solution of course
+2009-02-27.txt:16:17:58: ais523: it is JIT, kind of
+2009-02-27.txt:16:18:00: I would certainly suggest JITted constant folding
+2009-02-27.txt:16:18:17: kind of like JIT
+2009-02-27.txt:16:18:30: ais523, that is AOT not JIT right?
+2009-02-27.txt:16:19:54: ehird, I think JIT with constant fold, and if +- and such are redefined invalidate all JIT compiled code using it
+2009-02-27.txt:16:21:50: maybe you could use some sort of JIT constant unfolding?
+2009-02-27.txt:16:22:16: you wouldn't JIT it again directly
+2009-02-27.txt:16:22:28: I just told you I'm not JITting
+2009-02-27.txt:16:22:44: JITing is probably better for this though...
+2009-02-27.txt:16:23:13: but the Java JIT can do inlining and such
+2009-02-27.txt:16:23:44: ehird, well mark a unit as "need to be-rejitted on next use" then
+2009-02-27.txt:16:24:06: fizzie, there? Will jitfunge implement IMAP? :D
+2009-03-06.txt:14:54:09: fizzie, will jitfunge implement IMAP?
+2009-03-06.txt:14:54:51: I don't think so, no. I'm currently again in the hibernationary "collecting motivation" stage re jitfunge.
+2009-03-08.txt:13:47:22: Deewiant, it is a bad idea to make sarcastic comments about cfunge. You won't have anything left to say for jitfunge then
+2009-03-08.txt:14:01:16: Deewiant, anyway I think fizzie isn't working on jitfunge currently
+2009-03-10.txt:14:53:58: ehird, well it all depends on what you are using it for. fungot running a slow underload interpreter? cfunge or in the future jitfunge
+2009-03-10.txt:14:58:32: Deewiant, no, jitfunge is more optimised
+2009-03-10.txt:14:58:40: uh, jitfunge doesn't optimize.
+2009-03-10.txt:14:59:29: ehird, well depending on what you mean, nor does cfunge. jitfunge could potentially. Just in cfunge I tried to write all the C code fast. But I don't try to constant fold code. Like jitfunge does
+2009-03-10.txt:14:59:38: so I'd say jitfunge is more optimising
+2009-03-10.txt:14:59:55: jitfunge is broken, though; that's a disadvantage.
+2009-03-11.txt:14:21:19: -!- rabideejit has joined #esoteric.
+2009-03-11.txt:14:21:46: Greeting.
+2009-03-11.txt:14:23:02: I have a new language for you.
+2009-03-11.txt:14:23:02: Consider deciphering the contents of http://esoteric.voxelperfect.net/wiki/Kolmogorov and http://www.killersmurf.com/projects/Kolmogorov
+2009-03-11.txt:14:25:14: yes
+2009-03-11.txt:14:25:29: Aaah! It's you. You inspired me.
+2009-03-11.txt:14:25:40: greetings, rabideejit
+2009-03-11.txt:14:26:01: indeed you did.
+2009-03-11.txt:14:26:22: Greetings ais.
+2009-03-11.txt:14:27:12: Yes. The andrei machine is much closer to what Kolmogorov had in mind, I'd say.
+2009-03-11.txt:14:30:21: Ah, thankyou! I was looking for that. My source was Uri Gurevich's on Kolmogorov Machines and Related issues.
+2009-03-11.txt:14:37:03: Ah, the Andrei machine has an easy-to-reach register. The challenger of the Kolmogorov language is all your data is all pointing to each other and you get lost. Hence the 500 line 99 bottles of beer.
+2009-03-11.txt:14:37:11: *challenge
+2009-03-11.txt:14:37:24: Freudian there.
+2009-03-11.txt:14:37:54: Ah but I guess the Andrei register is a bit hard to reach, as you have to run through the graph to get it.
+2009-03-11.txt:14:38:24: Indeed.
+2009-03-11.txt:14:38:52: It would be crazy.
+2009-03-11.txt:14:39:48: Yes, it seems a very complex problem.
+2009-03-11.txt:14:40:23: Hmmmmm!
+2009-03-11.txt:14:44:02: hum, ho.
+2009-03-11.txt:14:45:58: I must take your leave, I need to eat some yogurt. Nice to meet you Slereah.
+2009-03-11.txt:14:46:19: -!- rabideejit has quit ("Leaving.").
+2009-03-11.txt:18:08:55: our JIT is about 20x faster than CPython. ]]
+2009-03-12.txt:11:54:53: I'm eagerly waiting for the next C++ sprint, I could actually work on jitfunge a bit at that point.
+2009-03-12.txt:11:54:56: jitfunge was in C++ right?
+2009-03-12.txt:12:09:43: I would like to work on jitfunge more if there wasn't that damned self-modification going on. I can't even compile a constant-argument p into a simple memory store, without worrying that later the jitter is going to create a compiled trace at that location, and it will then be invalidated if this particular p instruction is ever executed.
+2009-03-12.txt:12:15:17: fizzie, also if there wasn't self modification you could just compile it normally without needing JIT
+2009-03-12.txt:12:16:00: Given the funky Befunge code-flow, that's not completely trivial either; I'd still suspect a tracing JIT could be the way to go. It'd just be a lot easier.
+2009-03-12.txt:12:16:56: Currently jitfunge has a "solution" which basically boils down recording in funge-space all the cells where any compiled-to-memory-store-puts refer to, and later if we end up executing code in such a place, invalidating the referring code.
+2009-03-12.txt:14:59:34: Fujitsu Siemens is partially German (the Siemens side, surprisingly) and they manufacture computers.
+2009-03-12.txt:16:01:54: 10:54 fizzie: I'm eagerly waiting for the next C++ sprint, I could actually work on jitfunge a bit at that point.
+2009-03-12.txt:16:01:55: 10:54 AnMaster: jitfunge was in C++ right?
+2009-03-13.txt:22:39:14: Deewiant, well, I like NX. It is actually useful. Only JITs need to disable it really.
+2009-03-13.txt:22:41:14: Deewiant, any non-esoteric examples of self modifying code? Apart from JITs that is.
+2009-03-15.txt:19:41:59: ehird, unknown. It depends on what you do. If fizzie finishes his jitfunge he will beat cfunge at single-threaded apps
+2009-03-15.txt:19:42:55: fizzie, how do you plan to implement t in jitfunge?
+2009-03-15.txt:19:44:09: I don't really have a plan there; I don't see any sensible way of doing synchronous threads with jitfunge without it being completely brainless.
+2009-03-15.txt:21:52:42: I just found something useful for you in jitfunge
+2009-03-15.txt:22:02:39: The instructions they mention are SSE-only, though. And it'll be a while before I get to the optimizationary stage with jitfunge, you may need to remind me about that later.
+2009-03-15.txt:22:05:22: The code generated by jitfunge is pretty sucky.
+2009-03-15.txt:22:11:32: I'll have to think about the llvm thing. Currently the jitfunge code is pretty convoluted; maybe if I cleaned it up a bit so that there'd be a clean-ish-er interface between code generation and the rest of the code, I could even experiment better.
+2009-03-15.txt:22:23:59: fizzie, Concurrent JIT
+2009-03-15.txt:22:42:15: I know a precious little about x86 low-level details for a JIT-writer.
+2009-03-24.txt:22:15:24: well JIT then
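
For context on the trace-invalidation scheme described in the 2009-03-12 log lines above (recording which funge-space cells a compiled trace was built from, and invalidating that trace when a p instruction writes to one of them), here is a minimal illustrative sketch. It is not jitfunge's actual code; Cell, Trace, TraceCache, record and on_put are hypothetical names chosen for the example.

// Illustrative sketch only, not jitfunge's implementation.
// Maps each funge-space cell to the compiled traces that read it,
// and marks those traces invalid when the cell is overwritten by 'p'.
#include <cstdint>
#include <map>
#include <memory>
#include <set>
#include <vector>

struct Cell {
    int64_t x, y;
    bool operator<(const Cell& o) const {
        return x < o.x || (x == o.x && y < o.y);
    }
};

struct Trace {
    std::vector<Cell> cells;  // funge-space cells this compiled trace was built from
    bool valid = true;        // cleared when any of those cells is overwritten
};

class TraceCache {
    std::map<Cell, std::set<Trace*>> by_cell_;    // reverse index: cell -> dependent traces
    std::vector<std::unique_ptr<Trace>> traces_;
public:
    // Register a freshly compiled trace and remember which cells it depends on.
    Trace* record(std::vector<Cell> cells) {
        traces_.push_back(std::make_unique<Trace>());
        Trace* t = traces_.back().get();
        t->cells = std::move(cells);
        for (const Cell& c : t->cells)
            by_cell_[c].insert(t);
        return t;
    }
    // Hook for the 'p' instruction: invalidate every trace compiled from cell c,
    // so it gets recompiled the next time execution reaches that location.
    void on_put(const Cell& c) {
        auto it = by_cell_.find(c);
        if (it == by_cell_.end())
            return;
        for (Trace* t : it->second)
            t->valid = false;
        by_cell_.erase(it);
    }
};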