
Leaderboard

Popular Content

Showing content with the highest reputation on 01/04/2013 in all areas

  1. Dyer is a known hoaxer, a racist and a sociopath, and much like those who forgave Biscardi after numerous proven hoaxes, there will be those who defend him out of a hope of proving the existence of bigfoot, over common sense. This is not to say that I don't believe bigfoot might exist; what I am saying is that giving Dyer any second chance, exemptions from his past, or even a grain of confidence will give him (or Biscardi) the motivation to continue. Dishonesty is a characteristic and a personality trait (in my opinion); you cannot turn it on and off, and while one may not always tell lies, fabricate, or stretch the truth, the foundation for that particular character trait is always there. You don't see Matt Whitton defending his friend Ricky, quite the opposite.
     http://www.thecryptocrew.com/2011/12/matt-whitton-of-georgia-hoax-talks.html
     http://www.thecryptocrew.com/2012/05/rick-dyer-releases-photos-of-bigfoot.html
     All you have to do is visit Dyer's website to see a sociopath's thought process in action; after all, why be defensive while telling a lie when it is much easier to muddy the waters surrounding a lie by going on the offensive? Classic sociopath behavior.
     http://bigfootevidence101.blogspot.com/
     And just to drive the point home as to what sort of "man" Dyer is:
     http://bigfootevidence.blogspot.com/2011/12/what-is-rick-dyers-mugshot-doing-on.html
     http://www.bigfootlunchclub.com/2011/01/bigfoot-hoaxer-rick-dyer-arrested-for.html
     And my favorite:
     http://squatchdetective.wordpress.com/2012/12/26/since-everyone-is-pointing-out-an-old-arrest-of-dyer-let-me-point-out-something-rather-new/
     If he were on fire, I personally consider my urine to be of far more worth than he is, and would not see it wasted putting out that particular fire.
     GoLd
    2 points
  2. Tyler, to be fair she did deny that there was anything paranormal in her paper a few weeks ago on FB. I really think we should absolutely stop using RL as a source of information, even if he is a blind squirrel that occasionally finds a nut. He also just rips off speculation from posts here and repeats it as "inside information." In fact, if you go back to when the "Angel DNA" comments first appeared on RL's blog, it was right after there were some oblique "Nephilim" comments in the "Ketchum Report" thread.
    2 points
  3. Okay, let me see if I can illustrate this in a straightforward way. Hopefully I don't get tripped up by formatting issues. Here is a portion of non-coding sequence (that is to say, it does not contain information that determines the amino acid sequence of the protein) from brewer's yeast cytochrome b:
     5'ATTTTCTCCGTATTCATTATTATATTATCTAATTTATAAAATATTTAAAGACTTATAATAATATAACATCTTTGTAAATTATTGTTAAAG3'
     3'TAAAAGAGGCATAAGTAATAATATAATAGATTAAATATTTTATAAATTTCTGAATATTATTATATTGTAGAAACATTTAATAACAATTTC5'
     Let's say this is the sequence I have and I'd like to use it to detect wild yeasts out in the environment. So, I design my 2 primers based on the sequence in red and blue. So long as there are exact or near-exact matches between target and primer, I will detect. However, let's say I have a sample containing a yeast whose sequence looks like this at the red target due to a G:C basepair insertion in the middle:
     5'CTCCGTATTCATTAGTTATATTATCTAAT......
     3'GAGGCATAAGTAATCAATATAATAGATTA.......
     That single insertion, which would result from a proofreading error in the DNA replication of some ancestor yeast, significantly disrupts the pairing of my primer and the target DNA. In this case I would not be able to detect the unknown wild yeast using my PCR assay. However, I would detect brewer's yeast or other yeasts with the same or similar sequence in my sample.
     GTATTCATTATTATATTATC->
     3'GAGGCATAAGTAATCAATATAATAGATTA.......
     A deletion (also due to a proofreading error in DNA replication) would cause a similar type of misalignment between primer and target. Hopefully that helps.
     Two possibilities that I can think of: 1) the "special" primers may be the short primers that are used in the Next Generation Sequencing approach - they are short and will latch onto target DNA somewhat indiscriminately, so that just about everything in the mix gets amplified and then sorted out by genetic sequence information - and 2) she may have generated sequence information and then designed very specific primers that would allow her to do quick and easy assessments of incoming samples.
     I thought she has said that her sequences are different from known hominid (e.g. Neanderthal and Denisovan) sequences. And it isn't necessarily a matter of "extracting DNA where others fail," but of actually amplifying the DNA where others were unable to, because her primers are different. I think the key piece of information that has come from MK is that she has a lot of sequence. Once you have the sequence sorted out and you begin comparing it to other sequences, things begin to fall into place.
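     To make that insertion example concrete, here is a minimal Python sketch (my own illustration, not the poster's assay or any lab's primer-design software). It treats primer binding as a fixed-length, position-by-position comparison with a small mismatch tolerance; the 2-mismatch threshold is an assumed, illustrative value. The primer and target sequences are the ones shown above.

     # Illustrative sketch only: why a fixed-length primer tolerates a couple of
     # substitutions but fails when the target carries a single-base insertion.
     # The mismatch tolerance of 2 is an assumed, illustrative value.

     def primer_binds(primer: str, target: str, max_mismatches: int = 2) -> bool:
         """Slide the primer along the target and count position-by-position
         mismatches. Substitutions within the tolerance still 'bind'; an insertion
         or deletion shifts every downstream base, so no window stays under the
         threshold."""
         plen = len(primer)
         for start in range(len(target) - plen + 1):
             window = target[start:start + plen]
             mismatches = sum(1 for a, b in zip(primer, window) if a != b)
             if mismatches <= max_mismatches:
                 return True
         return False

     # Forward primer site from the example above (20 nt, top strand).
     primer = "GTATTCATTATTATATTATC"

     # Known target region vs. the variant carrying one extra G.
     known_target   = "CTCCGTATTCATTATTATATTATCTAAT"
     variant_target = "CTCCGTATTCATTAGTTATATTATCTAAT"

     print(primer_binds(primer, known_target))    # True  -> target detected, amplification expected
     print(primer_binds(primer, variant_target))  # False -> the "wild yeast" goes undetected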
    2 points
  4. FWIW, the first two online interviews that Ed Smith granted years ago were on Tom Biscardi's podcast. That being said, Ed isn't affiliated with Biscardi. He merely appeared on his internet "radio" show a few years back.
    1 point
  5. A recent show on PBS called Wolves in Wisconsin, first broadcast in 2012, had an interesting segment in it. It discussed how the photographer obtained the footage: the wolves were EXTREMELY aware and wary of 'silent' trail cams, and would react to the photographer's zoom lens at great distances. It took this photographer 4 years, full time in the field, to obtain 14 minutes, yes, 14 minutes, of clear footage. This, in an area where a population of wolves was known to exist. A bit of perspective there.....
    1 point
  6. As the great legal mind Cicero once commented: "When it comes to establishing the poor credibility of a witness, second only to the witness who impeaches himself, is the witness who opens his door a crack, but thereafter refuses to open it all the way."
    1 point
  7. Tyler, I think we can agree that so long as the belief system does not inhibit the presentation of facts, nor overly bog down the interpretation of the data, we should be tolerant, if not respectful. We won't really know the situation until we see what she puts in her paper and her presentations.
     Anyway, I thought I'd come back to what I mentioned earlier: the idea that all three parties could in good faith report information that they know to be true, but that can't possibly all be 100% true. So, for the sake of this discussion, let's accept that the following are true: 1) JS gave portions of the same sample to MK and to Tyler/Bart, who in turn provided them to service labs; 2) MK generated data that upon extensive analysis is indicative of an uncharacterized animal in the primate lineage; 3) service labs returned results and interpretations indicating the presence of material from known animals (including human) only.
     A lot has been said concerning 1 and 2, so let me cut to the chase and focus on 3. How could mammalian universal primers fail to pick up an unknown mammal if its DNA was present in the sample? There are a number of reasons, but I can think of two that could be consistent with the data in Tyler's report. First of all, those results look very clean and appear to be relatively unambiguous. The lab has designed their assays with known entities in mind - they've gone and found all "relevant" DNA sequences, lined them up, and designed their PCR primers to ensure broad detection. Hell! The DNA sequence is unambiguous...about what has been amplified. As far as they are concerned, they have their positive. It totally fits with their reality and their test design. Case closed.
     So, how do you get around what should be the predominant contributor being noticeably absent? There are 2 types of disruption that will kill the PCR amplification of a target nucleic acid: 1) nucleotide insertions and 2) nucleotide deletions. Oligonucleotide primers (or just "primers") are designed to associate with a complementary DNA sequence of determined length. So, typically a researcher will look at a stretch of defined DNA sequence and design 2 primers (one commonly called the "forward primer" and one called the "reverse primer") based on a number of criteria that the researcher feels confident will lead to good performance in the assay. Let's say the researcher decides on primers that are 20 nucleotides in length and sends in an order to have them synthesized to specification. The primers are tested against known targets and they work as expected, i.e. they match perfectly or nearly perfectly across all 20 nucleotides in the targets (DNA) of interest and give the expected results. Hurray!
     Now, it is true that primers can tolerate some kinds of mismatches in the specified sequence and still work - that's generally how universal primers work across a broad spectrum of species. But insertions and deletions are difficult because they create unique physical constraints in the pairing of primers and target DNA. It will take only one insertion or deletion in either the forward or reverse primer site to pretty much kill the amplification of that specific DNA. So, if you have a mix where the predominant (unknown) DNA has a bad match due to an insertion/deletion, you get no amplification, whereas the known, matching DNA targets will amplify preferentially. The problem here is that insertions and deletions are also very disruptive to protein coding sequences, and there may be evolutionary pressures against such disruptions. However, disruptions in sequences outside of protein coding regions may be more forgiving (and therefore evolutionarily acceptable). I am not familiar with the design of these primers, so I can't assess the probability of this occurring here, but it is certainly possible.
     So, how might MK have gotten around this? Well, in her press release and interviews she has identified "Next Generation Sequencing" as the technology platform that has generated her most convincing data. Next Generation Sequencing is a relatively new technology platform that couples generic nucleic acid (DNA) amplification with high-throughput sequencing. It uses much shorter oligonucleotide primers to amplify pretty much all the DNA in a sample for sequencing. The sequence data is rapidly fed into very powerful computer software that sorts and aligns the information. The view of the sample would be much more comprehensive, and pitfalls due to insertions/deletions would be avoided or rendered insignificant by massive amounts of corroborative information. It is a tremendous investigative tool, but it would also have technical pitfalls of its own. I believe some have indicated that this work was actually contracted out to labs with the expertise and the appropriate instruments and computer systems. So, that is one possible explanation. However, there still seems to be a strong case for bear.
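     To put rough numbers on that "amplify preferentially" point, here is a toy Python sketch (my own illustration; the species names follow the discussion above, and the starting copy numbers and cycle count are made-up assumptions, not anything reported by the labs). Templates whose primer sites match double every cycle, while a template with an indel at a primer site never amplifies, so even a predominant unknown contributor ends up invisible in the PCR product that gets sequenced.

     # Toy model, illustrative only: matched templates double each PCR cycle;
     # a template with an indel at the primer site does not amplify at all.
     # Input copy numbers and the 30-cycle run are assumed for illustration.

     def product_shares(templates, cycles=30):
         """templates maps name -> (starting copies, primers_bind). Returns each
         template's fraction of the total DNA present after amplification."""
         final = {}
         for name, (copies, binds) in templates.items():
             final[name] = copies * (2 ** cycles) if binds else copies  # unamplified DNA stays at input level
         total = sum(final.values())
         return {name: amount / total for name, amount in final.items()}

     sample = {
         "human (contamination)": (1_000, True),    # primer sites match -> amplifies
         "black bear":            (5_000, True),    # primer sites match -> amplifies
         "unknown primate":       (50_000, False),  # indel at a primer site -> no amplification
     }

     for name, share in product_shares(sample).items():
         print(f"{name}: {share:.2%} of the final product")
     # Even though the unknown DNA dominated the input, the sequenced product
     # reads as human plus bear - the predominant contributor is effectively absent.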
    1 point