
Tag Archives: Technology

Source of North Korea ICBM? Putin’s Bitch’s Master

One of the questions about North Korea’s missile program is how exactly they improved from basically short-range missiles capable of reaching Japan to true ICBMs capable of reaching the East Coast of the US in little more than a year.

Turns out the answer to that question isn’t aid from China. It is missile technology given to the North Koreans by the Chumph’s butt-buddy, Putin.

Putin has transferred SS-18/19 capability to the North Koreans. More than likely, the Russians have also given the North Koreans the miniaturized W-31 warhead capability, first stolen from the US during the Reagan Administration. Below is a graphic of potential North Korean nuclear targets captured by the Washington Post.

If the Chumph wasn’t so busy sucking Putin’s man parts, he would have the courage to cut off the NK missile pipeline.

 

To understand how “rapid” this missile “development” has been, see the following graphic, with the leftmost red line being the NK launch on July 4th of this year and the rightmost red line being the July 28th launch –

 

NORTH KOREA’S NEW MISSILES CAME FROM UKRAINE AND RUSSIA, REPORT CLAIMS

The speed at which North Korea has ramped up its missile and nuclear defense programs within the last two years is reportedly due to purchases Kim Jong Un’s regime has made on a weapons black market linked to Ukraine and Russia, as the United States and the rest of the world frets over a potential military conflict.

A new report released Monday by the International Institute for Strategic Studies explained the North has made “astounding strides” in missile development and said it could not have done so without a high-performance liquid-propellant engine, or LPE, provided by a “foreign source.”

“Claims that the LPE is a North Korean product would be more believable if the country’s experts had in the recent past developed and tested a series of smaller, less powerful engines, but there are no reports of such activities,” the report, penned by missile expert Michael Elleman, read.

Citing available evidence, which can be sparse due to the secretive ways of the North and its isolation from the rest of the world, the report states that North Korea’s ability to jump from short- and medium-range missiles and a flawed type of intermediate-range missile to a more advanced and successful intermediate Hwasong-12 and an intercontinental ballistic missile, called Hwasong-14, could only have occurred with an LPE related to the Soviet RD-250 engines.

Stating that it was “less likely” that Russian engineers could have directly worked on the North’s missiles, the report draws the conclusion that the Soviet RD-250 engines, and the requisite experience with that class of missile, stemmed from factories of either the top Russian rocket engine manufacturer Energomash or Ukraine’s KB Yuzhnoye.

“One has to conclude that the modified engines were made in those factories,” the report read.

The latter company has a factory based in Dnipro, Ukraine, located inside a part of the country attempting to break away and join Russia amid a military conflict, and U.S. intelligence agencies believe the Soviet rockets in use by the North were likely made there as the state-owned factory has struggled, The New York Times reported.

Also, back in 2011, North Koreans were caught attempting to steal missile secrets from the factory, and the North may have tried to infiltrate the factory another time.

The rocket engines also are believed to be the very ones the North used to test two missiles last month, which has led to more threats from Kim and calls for diplomacy by China—the North’s sole ally—and even threats from U.S. President Donald Trump.

 


Speaking of Failing Systems

In Europe, and probably China, the train system has a number of safety controls which prevent operators from driving a train into a platform, or, as in the case of the wreck in Philadelphia last year, taking a turn at speeds in excess of what the train can handle.

This is all accomplished by a digital radio system which monitors the train and intervenes when it detects unsafe conditions.
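The supervision logic at the heart of such a system is conceptually simple. Here is a minimal sketch; the function name, margin, and actions are illustrative assumptions, not any railroad’s actual implementation:

```python
# Toy sketch of the speed-supervision logic behind systems like
# Positive Train Control. Thresholds and action names are illustrative.

def supervise(speed_mph: float, limit_mph: float,
              margin_mph: float = 5.0) -> str:
    """Return the control action for the current speed reading."""
    if speed_mph > limit_mph:
        return "emergency_brake"    # hard overspeed: system stops the train
    if speed_mph > limit_mph - margin_mph:
        return "warn_and_slow"      # approaching the limit: alert the crew
    return "normal"

# A curve rated for 50 mph, taken at roughly double that
# (about what happened in the Philadelphia wreck):
print(supervise(102, 50))   # -> emergency_brake
print(supervise(47, 50))    # -> warn_and_slow
print(supervise(30, 50))    # -> normal
```

The point is that the decision itself is trivial; what costs money is the radio network and trackside infrastructure that feed it live speed limits.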

The reason we don’t have this is in part the FCC, and in second part Congress, which is often paid by special interests to suppress technology.

So we just get along with what happened today…Again

So far in 2016, 11 of the rail crashes in the entire world have been in the United States. That’s 22% of all the crashes.

Train Crashes Into Hoboken Terminal, Killing 1, Injuring Scores Of People

A commuter train crashed into Hoboken Terminal in New Jersey on Thursday morning, resulting in multiple injuries and visible structural damage.

One person was killed and at least 65 people were injured, according to Gov. Andrew Cuomo of New York. New Jersey transit officials said at least 100 people were hurt, Stephen Nessen of WNYC reports.

Joseph Scott, the CEO of Jersey City Medical Center, said the hospital had admitted some victims in critical or serious condition.

Gov. Chris Christie of New Jersey told MSNBC and CNN shortly before noon Eastern that everyone trapped on the train had been rescued or removed, and that there was no indication that the crash was anything but an accident.

The train that crashed originated in Spring Valley, N.Y., and “collided with the platform” at Hoboken Terminal at around 8:45 a.m. Eastern, Cuomo said in a statement.

Nessen reports that passengers from the train that crashed say it approached the station at “full speed” before slamming into the barrier at the end of the track, the impact throwing riders onto the floor.

WNYC’s Nancy Solomon arrived on the scene shortly after the crash, and says she personally saw 20 to 30 injured people, including at least four who were unable to walk.

She says a train had traveled “all the way into the station — not into the waiting room but into the outdoor part where people transfer.”

 

Posted on September 29, 2016 in American Greed

 


Deep Sea Human Robot

Currently, human deep-sea divers can work under the sea at depths of 450′. A few very specialized divers can work as deep as 750′ for very short periods.

This technology is revolutionary: it lets a remote human operator, via electronic feedback, actually feel objects the submersible robot is holding and see the ocean around them as if they were there. The prototype is capable of operating at 300 meters, close to 1,000 ft deep, and as the technology is further developed it will likely go much deeper. This is the very leading edge of robotics, Artificial Intelligence, and Virtual Reality technology, and it fundamentally changes the game in terms of underwater exploration and eventual undersea habitation.
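To see why those depths matter, consider the ambient pressure. Using the common rough rule of about one extra atmosphere per 10 meters of seawater (a back-of-the-envelope approximation, not dive-table math), the numbers look like this:

```python
# Rough ambient pressure at depth, using ~1 atm of added pressure
# per 10 m of seawater. Depths approximate 450 ft, 750 ft, and the
# robot's 300 m rating.

def pressure_atm(depth_m: float) -> float:
    return 1.0 + depth_m / 10.0

for depth in (137, 229, 300):
    print(f"{depth:4d} m ≈ {pressure_atm(depth):.1f} atm")
# 300 m works out to roughly 31 atmospheres -- far beyond what a
# human body tolerates outside an armored suit.
```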

 

 

Posted on April 29, 2016 in General

 


Why Black Folks Need Encryption

The battle between Apple Computer and the FBI is not just about the encrypted messages on one iPhone. What it is really about is expanding the legal ability of the Federal Government to spy on its citizens. This has been an ongoing battle, largely behind the scenes, between tech developers and Government agencies such as NIST and the NSA. Indeed, in 1988 the Government forced standards groups to make TCP/IP the telecom standard upon which the Internet is based, and rejected the OSI standard, in large part because the OSI standard included encryption that was difficult to break, along with features which allowed the development of advanced security technologies at the “transmission” level. TCP had none of this, and was easily compromised by government spooks, who had been using and testing the vulnerabilities of the protocol stack for years. The Internet is not secure and, by design, can never be secured, which is why Government defense and intelligence agencies use a custom variation of TCP/IP which incorporates capabilities to enhance secure communication.
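The practical upshot of “no transmission-level encryption” is easy to demonstrate. In this loopback sketch (purely illustrative; the listener stands in for any sniffer sitting on the network path), a plain TCP payload crosses the wire exactly as the application wrote it:

```python
# Plain TCP carries application bytes in cleartext. The "listener"
# thread here plays the role of anyone observing the connection.
import socket
import threading

def listener(server: socket.socket, captured: list):
    conn, _ = server.accept()
    captured.append(conn.recv(1024))   # "the wire" sees the raw payload
    conn.close()

server = socket.socket()               # IPv4 TCP socket
server.bind(("127.0.0.1", 0))          # any free local port
server.listen(1)
captured = []
t = threading.Thread(target=listener, args=(server, captured))
t.start()

client = socket.socket()
client.connect(server.getsockname())
client.sendall(b"account=1234 pin=9999")   # sent exactly as typed
client.close()
t.join()
server.close()

print(captured[0])   # -> b'account=1234 pin=9999'
```

Securing that payload is left entirely to the application layer, which is the author’s point: the protocol itself offers nothing.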

There are roughly 7 major commercial off-the-shelf encryption systems in use today. In order to sell those in the US, the NSA requires that security “backdoors” be built in such that the systems may be easily compromised by the NSA and, we would assume, law enforcement agencies. This requirement does not cover, and has not covered to this point, encryption systems on devices which store information internally in flash memory or disk drives. As such, manufacturers and equipment owners have been free to encrypt and secure their data by any means they deem appropriate.

There is no such thing as unbreakable encryption. Even that used by the most secret agencies in the Government isn’t unbreakable. What it is is a system which is difficult enough to break that only another government or massive corporation has access to the computing horsepower and scientists to do so, which costs lots and lots of money. So it is actually a race to develop new, more complex systems as each previous system is broken. And I can tell you from personal experience that two guys in a garage who, against the odds, come up with something new and radical which might give those systems heartburn are in for a visit from the guys in black suits shortly after letting the world know they’ve developed it.
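The “difficult enough to break” arithmetic is easy to sketch. Assuming a hypothetical brute-force rig testing a trillion keys per second (an assumption for illustration, not any agency’s real capability), exhaustive key search times scale like this:

```python
# Brute-force cost: each added key bit doubles the search space.

def years_to_exhaust(key_bits: int, guesses_per_second: float) -> float:
    """Years needed to try every key at the given guess rate."""
    return (2 ** key_bits) / guesses_per_second / (3600 * 24 * 365)

rate = 1e12   # hypothetical: one trillion guesses per second
print(f"56-bit (old DES): {years_to_exhaust(56, rate):.4f} years")
# a tiny fraction of a year -- about 20 hours
print(f"128-bit:          {years_to_exhaust(128, rate):.2e} years")
# on the order of 10**19 years
```

That gap between “hours” and “longer than the age of the universe” is why breaking modern systems means attacking the implementation or the math, not guessing keys.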

Now, most of what you see on TV about the technology is bullshit. You are not breaking into an unknown secured local network just in time for the hero to do his thing with your trusty laptop. There is no such thing as a 100% secure network if it has any connection at all to the Internet. The Internet of Things (IoT), a hot-button meme in the industry right now, isn’t going to happen, because the security in wireless systems is so poor (on purpose and by design).

So…if you are using commercial off-the-shelf products like cell phones and computers, you can’t stop them from listening in. What you can do is make it difficult enough that unless somebody like the NSA comes after you, you have security. Which is exactly what Apple did.

Here’s why civil rights activists are siding with the tech giant.

Last night, the FBI, saying that it may be able to crack an iPhone without Apple’s help, convinced a federal judge to delay the trial over its encryption dispute with the tech company. In February, you may recall, US magistrate judge Sheri Pym ruled that Apple had to help the FBI access data from a phone used by one of the San Bernardino shooters. Apple refused, arguing that it would have to invent software that amounts to a master key for iPhones—software that doesn’t exist for the explicit reason that it would put the privacy of millions of iPhone users at risk. The FBI now has two weeks to determine whether its new method is viable. If it is, the whole trial could be moot.

That would be a mixed blessing for racial justice activists, some of them affiliated with Black Lives Matter, who recently wrote to Judge Pym and laid out some reasons she should rule against the FBI. The letter—one of dozens sent by Apple supporters—cited the FBI’s history of spying on civil rights organizers and shared some of the signatories’ personal experiences with government overreach.

“One need only look to the days of J. Edgar Hoover and wiretapping of Rev. Martin Luther King, Jr. to recognize the FBI has not always respected the right to privacy for groups it did not agree with,” they wrote. (Targeted surveillance of civil rights leaders was also a focus of a recent PBS documentary on the Black Panther Party.) Nor is this sort of thing ancient history, they argued: “Many of us, as civil rights advocates, have become targets of government surveillance for no reason beyond our advocacy or provision of social services for the underrepresented.”

Black Lives Matter organizers have good reason to be concerned. Last summer, I reported that a Baltimore cyber-security firm had identified prominent Ferguson organizer (and Baltimore mayoral candidate) Deray McKesson as a “threat actor” who needed “continuous monitoring” to ensure public safety. The firm—Zero Fox—briefed members of an FBI intelligence partnership program about the data it had collected on Freddie Gray protest organizers. It later passed the information along to Baltimore city officials.

Department of Homeland Security emails, meanwhile, have indicated that Homeland tracked the movements of protesters and attendees of a black cultural event in Washington, DC, last spring. Emails from New York City’s Metropolitan Transit Authority and the Metro-North Railroad showed that undercover police officers monitored the activities of known organizers at Grand Central Station police brutality protests. The monitoring was part of a joint surveillance effort by MTA counter-terrorism agents and NYPD intelligence officers. (There are also well-documented instances of authorities spying on Occupy Wall Street activists.)

In December 2014, Chicago activists, citing a leaked police radio transmission, alleged that city police used a surveillance device called a Stingray to intercept their texts and phone calls during protests over the death of Eric Garner. The device, designed by military and space technology giant Harris Corporation, forces all cell phones within a given radius to connect to it, reroutes communications through the Stingray, and allows officers to read texts and listen to phone calls—as well as track a phone’s location. (According to the ACLU, at least 63 law enforcement agencies in 21 states use Stingrays in police work—frequently without a warrant—and that’s probably an underestimate, since departments must sign agreements saying they will not disclose their use of the device.)

In addition to the official reports, several prominent Black Lives organizers in Baltimore, New York City, and Ferguson, Missouri, shared anecdotes of being followed and/or harassed by law enforcement even when they weren’t protesting. One activist told me how a National Guard humvee had tailed her home one day in 2014 during the Ferguson unrest, matching her diversions turn for turn. Another organizer was greeted by dozens of officers during a benign trip to a Ferguson-area Wal-Mart, despite having never made public where she was going.

In light of the history and their own personal experiences, many activists have been taking extra precautions. “We know that lawful democratic activism is being monitored illegally without a warrant,” says Malkia Cyril, director of the Center for Media Justice in Oakland and a signatory on the Apple-FBI letter. “In response, we are using encrypted technologies so that we can exercise our democratic First and Fourth Amendment rights.” Asked whether she believes the FBI’s promises to use any software Apple creates to break into the San Bernardino phone only, Cyril responds: “Absolutely not.”

“I don’t think it’s any secret that activists are using encryption methods,” says Lawrence Grandpre, an organizer with Leaders of a Beautiful Struggle in Baltimore. Grandpre says he and others used an encrypted texting app to communicate during the Freddie Gray protests. He declined to name the app, but said it assigns a PIN to each phone that has been approved to access messages sent within a particular group of people. If an unapproved device tries to receive a message, the app notifies the sender and blocks the message from being sent. Grandpre says he received these notifications during the Freddie Gray protests: “Multiple times we couldn’t send text messages because the program said there’s a possibility of interception.”
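The behavior Grandpre describes can be sketched as a simple approval check before sending. Everything below is a toy illustration of that described behavior; the PINs, names, and logic are assumptions, not the actual app’s protocol:

```python
# Toy sketch: a message goes out only if every recipient device holds
# an approved PIN; otherwise the sender is notified and the send is
# blocked. Illustrative only -- not any real app's protocol.

APPROVED_PINS = {"4821", "7730", "1194"}   # hypothetical approved devices

def try_send(message: str, recipient_pins: list) -> str:
    unapproved = [p for p in recipient_pins if p not in APPROVED_PINS]
    if unapproved:
        return f"blocked: possible interception ({len(unapproved)} unapproved device(s))"
    return f"sent: {message}"

print(try_send("march at noon", ["4821", "7730"]))
# -> sent: march at noon
print(try_send("march at noon", ["4821", "9999"]))   # unknown device joined
# -> blocked: possible interception (1 unapproved device(s))
```

Real systems would do this with cryptographic device keys rather than plain PINs, but the user-visible effect is the same: the sender learns a message could not be delivered safely.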

Cyril says “all of the activists I know” use a texting and call-encryption app called Signal to communicate, and that the implication of a court verdict in favor of the FBI would be increased surveillance of the civil rights community. “It’s unprecedented for a tech company—for any company—to be compelled in this way,” Cyril says….

 

 

Posted on March 25, 2016 in BlackLivesMatter

 


Howard University Opens Tech Incubator

Positive progress!

Not sure about the efficacy of the selected management group because they seem a bit light on the investment side…but that can hopefully be reinforced and rectified downstream. Howard may want to look at George Mason’s “Century Club” as a model, or possibly generate a tech council in line with the “100” organizations in adjacent Montgomery and Fairfax Counties to provide mentorship and coaching from experienced individuals. Second, as Howard is also one of the nation’s premier med schools and has a hell of an engineering department, I hope they will look at expanding horizons beyond just coding websites.

HOWARD UNIVERSITY, MAYOR MURIEL BOWSER SELECT LUMA LAB TO OPERATE TECHNOLOGY & INNOVATION INCUBATOR

Washington, D.C., (February 16, 2016)- Today, Howard University and Mayor Muriel Bowser selected Luma Lab to operate a new DC-based incubator on Howard’s campus. As the District’s first venture capital hub for start-ups and emerging companies, the incubator will support underrepresented entrepreneurs and businesses that provide innovative products and services to underserved communities.

“Homegrown innovation in the District is expanding, and our tech sector is growing,” said Mayor Bowser. “I remain committed to supporting our entrepreneurs and startups so that every resident has a chance to benefit from the innovation economy. And I am confident that this partnership with Howard University and Luma Lab will keep DC on the cutting edge.”

Luma Lab was chosen from an elite group of technology and entrepreneurship organizations to operate the innovation hub. This selection demonstrates the commitment by Mayor Bowser and Howard President Dr. Wayne A.I. Frederick to foster an inclusive technology and innovation industry in the District.

“Howard’s partnership with the District and Luma Lab exemplifies the University’s longstanding commitment to innovation,” said Howard President Dr. Wayne A.I. Frederick. “This incubator will support our social mission by creating opportunities for the next generation of minority innovators at Howard University and beyond.”

In addition to offering technology and entrepreneurship training, the hub will offer affordable co-working space, networking events, mentorship, and strategic connections to Silicon Valley, investors, and partners. The hub will also provide tiered services and programs to its member companies, Howard students, staff, and faculty, as well as the broader Shaw and DC communities.

The District contributed nearly $1 million in grant funds to construct over 8,000 square feet of cutting edge workspace within Howard’s Wonder Plaza retail center in the 2300 block of Georgia Avenue.

“It is both an honor and privilege to be selected to operate the tech incubator here in DC,” said Luma Lab CEO Aaron Saunders. “Working closely with Howard University and Mayor Bowser’s office, we will improve the technology landscape in Washington, DC by providing the underserved community with the right tools and access to technology partners and a seat at the digital table as successful creators, innovators, and entrepreneurs.”

The incubator is expected to launch in Fall 2016, with the possibility for early programming on Howard’s campus.

About Howard University
Founded in 1867, Howard University is a private, research university that is comprised of 13 schools and colleges. Students pursue studies in more than 120 areas leading to undergraduate, graduate and professional degrees. Since 1998, the University has produced two Rhodes Scholars, two Truman Scholars, a Marshall Scholar, 30 Fulbright Scholars and 11 Pickering Fellows. Howard also produces more on campus African-American Ph.D. recipients than any other university in the United States. For more information on Howard University, call 202-238-2330, or visit the University’s Web site at www.howard.edu.

About Luma Lab
Luma Lab is the tech education arm of minority-owned Clearly Innovative, Inc. Luma Lab exposes students to the tech community through programming that engages and teaches an array of technical and entrepreneurial skills. Luma Lab provides education, mentorship and a positive learning environment for children, youth and adults. Our program exposes students to all facets of technology including lean startup principles, user experience, software development, and product management. Luma Lab is the first Washington DC recipient to be awarded the Chase Mission Street Grant for its dedication to nurturing the next generation of underserved coders, programmers, developers and tech gurus. For more information or sponsorship inquiries, please visit www.luma-lab.com/innovationhub, call 202-408-7514, or visit Luma Lab’s website at www.luma-lab.com.

 
 


Academic Steering and Black Students

The median pay for a person in my business with (and sometimes without) a Bachelor’s in Computer Science or Information Technology, plus a manufacturer certification such as a JNCIE/CCIE or a cyber-security cert such as a CISSP, is $120,000-$140,000 a year. No PhD required.

So WTF are you taking a degree track in basket weaving?

That same sort of math applies across several STEM based fields, including the Medical Technology industry, Chemical Engineering, some Aerospace, and other Hi-Tech areas. And yes – you have to work your ass off to get there unless you are one of those natural-born geniuses.

So tell me again: why are you enrolled in an undergraduate program where the salary average is a third of that? Despite the “diversity problems” on the left coast, there are literally tens of thousands of other jobs in the rest of the country.

About 10 percent of black computer science professors and Ph.D. students nationwide are at Clemson, thanks in large part to the work of one professor.

 

How US academia steers black students out of science

When the late Justice Antonin Scalia pointed out last year that “it does not benefit African-Americans to get them into the University of Texas [Austin] where they do not do well, as opposed to having them go to a less-advanced school, a less — a slower-track school where they do well,” he was roundly criticized by the left as a racist.

He was alluding, of course, to the “mismatch” problem that occurs when black students who are less qualified are admitted to more selective schools but do not graduate or do well at them as a result. Two recent studies, though, suggest that his words are truer now than ever.

The first comes from the Georgetown Center on Education and the Workforce, which found that black students are less likely to pursue lucrative majors than their white peers. According to the report, “African Americans account for only 8 percent of general engineering majors, 7 percent of mathematics majors, and only 5 percent of computer engineering majors.”

But they’re overrepresented in fields that don’t have high salaries: “21 percent in health and medical administrative services, compared to only 6 percent in the higher-earning detailed major of pharmacy, pharmaceutical sciences, and administration.”

Finally, it noted, “They are also highly represented in . . . [the low-paying fields of] human services and community organization (20%) and social work (19%).”

“There’s a huge inadequacy here in counseling,” Anthony Carnevale, director of the center and the lead author of the report, told the Atlantic.

This seems pretty unlikely. Who doesn’t realize computer engineers get paid well? The real problem is that too many black students are getting a hopelessly inadequate K-12 education and by the time they get to college, their best bet is to major in a subject whose exams have no wrong answers and whose professors engage in rampant grade inflation.

Carnevale also argues that’s because blacks are concentrated in open-access schools that have fewer choices of majors. But this, too, is questionable. Plenty of open-access universities offer courses and majors in STEM fields.

The implication is that black students at lower-tier universities are actually less likely to graduate in STEM majors than those at higher-tier ones, which is patently false. Indeed, the historically black colleges and universities, many of which aren’t selective at all, tend to have among the highest rates of graduating STEM majors.

And if you want to get a job in a lucrative STEM field, your chances of completing your degree are much better at a lower-tier school. But here’s the real kicker: A recent survey by the Wall Street Journal found that in “fields like science, technology, engineering and math, it largely doesn’t matter whether students go to a prestigious, expensive school or a low-priced one — expected earnings turn out the same.”

For instance, if you go to Manhattan College, where the average SAT score is around 1620, and major in engineering, your mid-career median pay will be $140,000. If you go to Rice, where the average SAT score is 2180, and major in engineering, your pay will be $145,000.

In other words, there’s not much upside financially to going to the more elite schools. But there is a huge downside: Your chances of graduating with a degree in that major fall dramatically.

If you want to know why there’s still a big salary difference for kids majoring in humanities and social sciences between elite and non-elite schools, it probably has something to do with the substance of the major.

Since most employers have no idea what you learned in your sociology classes, they’ll just assume the kids who went to Harvard are smarter.

But they’ll know exactly what you learned in your math and science classes and so they’ll compensate you well if you did reasonably well no matter where you took them.

If liberal elites really were concerned about increasing the graduation rates and career earnings of minority students, they would realize that the Ivy League is not the answer.

And forget Scalia’s racism about elite schools (UT Austin ain’t an “elite school” on the level of MIT, Stanford, or Caltech, although it is a good school). Nobody gives a good damn about your GPA 3 years after you graduate; they care about “what can you do for me.” Graduation from an elite school does get you a higher starting salary, an edge which really doesn’t disappear until late career. (HR in many companies never corrects that fact, leading to higher turnover of top performers from “lesser schools,” and then can’t figure out why their best programmer Jimmy, with a degree from Downstate U, quit to take a new job while Wilberforce from Big-Name U, an average performer, stays.) Either way, you are still making money that puts you in the top 2-3% of wage earners in the US. The folks that failed at that math are generally working in HR at less than half that, and all too often don’t have a clue.

 
 


You Can See Me Now – In The Movies

Back before digital photography, the film used in professional-level cameras had distinct qualities in terms of color rendition. Certain types of Kodak tended towards blue; others were “warm,” enriching the reds and yellows. This meant if you were shooting anything with blue, the sky for instance, the rendition was spectacular. Browns and greens tended to be “muddy,” and tonal quality, the differentiation between something with multiple greens for instance, tended to wash out into a “middling” color instead of the full spectrum. Fuji film tended towards yellow, and produced really vibrant greens and, to a lesser extent, browns…

Ergo – getting film to “see” black folks, or even render the plethora of skin tones was difficult, if not impossible. Getting fine detail was virtually impossible for darker skin tones.

Since similar film formulations were used to make movies, black folks just all came out as the same color, if you could see any detail at all.

‘12 Years a Slave,’ ‘Mother of George,’ and the aesthetic politics of filming black skin

In one of the first scenes of early Oscar favorite “12 Years a Slave,” the film’s protagonist, Solomon Northup, played by Chiwetel Ejiofor, is seen at night, sleeping alongside a fellow enslaved servant. Their faces are barely illuminated against the velvety black background, but the subtle differences in their complexions — his a burnished mahogany, hers bearing a lighter, more yellow cast — are clearly defined.

“Mother of George,” which like “12 Years a Slave” opens on Friday, takes place in modern-day Brooklyn, not the candlelit world of 19th-century Louisiana. But, like “12 Years a Slave,” its black stars and supporting players are exquisitely lit, their blue-black skin tones sharply contrasting with the African textiles they wear to create a vibrant tableau of textures and hues.

“Mother of George” and “12 Years a Slave” are just the most recent in a remarkable run of films this year by and about African Americans, films that range in genre from the urban realism of “Fruitvale Station” and light romantic comedy of “Baggage Claim” to the high-gloss historic drama of “Lee Daniels’ The Butler” and the evocatively gritty pot comedy “Newlyweeds.” The diversity of these films isn’t reflected just in their stories and characters, but in the wide range of skin tones they represent, from the deepest ebonies to the creamiest caramels.

The fact that audiences are seeing such a varied, nuanced spectrum of black faces isn’t just a matter of poetics, but politics — and the advent of digital filmmaking. For the first hundred years of cinema, when images were captured on celluloid and processed photochemically, disregard for black skin and its subtle shadings was inscribed in the technology itself, from how film-stock emulsions and light meters were calibrated, to the models used as standards for adjusting color and tone.

That embedded racism extended into the aesthetics of the medium itself, which from its very beginnings was predicated on the denigration and erasure of the black body. As far back as “The Birth of a Nation” — in which white actors wearing blackface depicted Reconstruction-era blacks as wild-eyed rapists and corrupt politicians — the technology and grammar of cinema and photography have been centered on the unspoken assumption that their rightful subjects would be white.

The result was that, if black people were visible at all, their images would often be painfully caricatured (see Hattie McDaniel in “Gone With the Wind”) or otherwise distorted, either ashy and washed-out or featureless points of contrast within the frame. As “12 Years a Slave” director Steve McQueen said in Toronto after the film’s premiere there, “I remember growing up and seeing Sidney Poitier sweating next to Rod Steiger in ‘In the Heat of the Night,’ and obviously [that was because] it’s very hot in the South. But also he was sweating because he had tons of light thrown on him, because the film stock wasn’t sensitive enough for black skin.”

Montré Aza Missouri, an assistant professor in film at Howard University, recalls being told by one of her instructors in London that “if you found yourself in the ‘unfortunate situation’ of shooting on the ‘Dark Continent,’ and if you’re shooting dark-skinned people, then you should rub Vaseline on their skin in order to reflect light. It was never an issue of questioning the technology.” In her classes at Howard, Missouri says, “I talk to my students about the idea that the tools used to make film, the science of it, are not racially neutral.”

Missouri reminds her students that the sensors used in light meters have been calibrated for white skin; rather than resorting to the offensive Vaseline solution, they need to manage the built-in bias of their instruments, in this case opening their cameras’ apertures one or two stops to allow more light through the lens. Filmmakers working with celluloid also need to take into account that most American film stocks weren’t manufactured with a sensitive enough dynamic range to capture a variety of dark skin tones. Even the female models whose images are used as reference points for color balance and tonal density during film processing — commonly called “China Girls” — were, until the mid-1990s, historically white.
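The stop arithmetic behind that advice is straightforward: each full stop an aperture is opened doubles the light reaching the film or sensor, so opening one to two stops yields two to four times the exposure. A quick sketch:

```python
# Exposure stops: light reaching the film doubles with each stop
# the aperture is opened.

def light_ratio(stops_opened: float) -> float:
    return 2.0 ** stops_opened

print(light_ratio(1))   # -> 2.0  (one stop: twice the light)
print(light_ratio(2))   # -> 4.0  (two stops: four times the light)
```

That two-to-four-times boost is what compensated for meters and stocks calibrated around lighter skin.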

In the face of such technological chauvinism, filmmakers have been forced to come up with workarounds, including those lights thrown on Poitier and a variety of gels, scrims and filters. But today, such workarounds have been rendered virtually obsolete by the advent of digital cinematography, which allows filmmakers much more flexibility both in capturing images and manipulating them during post-production.

Cinematographer Anastas Michos recalls filming “Freedomland” with Julianne Moore and Samuel L. Jackson, whose dramatically different complexions presented a challenge when they were in the same shot. “You had Julianne Moore, who has minus pigment in her skin, and Sam, who’s a dark-skinned guy. It was a photographic challenge to bring out the undertones in both of them.”

Michos solved the problem during a phase of post-production called the digital intermediate, during which the film print is digitized, then manipulated and fine-tuned. “You’re now able to isolate specific skin tones in terms of both brightness and color,” says Michos, who also shot “Baggage Claim,” “Jumping the Broom” and “Black Nativity,” due out later this year. “It gives you a little bit more flexibility in terms of how you paint the frame.”

Daniel Patterson, who shot “Newlyweeds” on a digital Red One camera, agrees, noting that on a recent shoot for Spike Lee’s “Da Blood of Jesus,” he was able to photograph black actors of dramatically different skin tones in a nighttime interior scene using just everyday house lamps, thanks to a sophisticated digital camera. “I just changed the wattage of the bulb, used a dimmer, and I didn’t have to use any film lights. That kind of blew me away,” Patterson says. “The camera was able to hold both of them during the scene without any issues.”

The multicultural realities films increasingly reflect go hand in hand with the advent of technology that’s finally able to capture them with accuracy and sensitivity. And on the forefront of this new vanguard is cinematographer and Howard University graduate Bradford Young, the latest in a long line of Howard alums — including Ernest Dickerson, Arthur Jafa and Malik Sayeed — who throughout the 1990s deployed the means of production to bring new forms of lyricism, stylization and depth to filmed images of African Americans….

 

Posted by on October 18, 2013 in Black History, The Post-Racial Life

 


Best Educated Janitors…

Fresh on the news that there are 21 million Americans out of work comes the question of the underemployed –

Why Did 17 Million Students Go to College?

Two sets of information were presented to me in the last 24 hours that have dramatically reinforced my feeling that diminishing returns have set in to investments in higher education, with increasing evidence suggesting that we are in one respect “overinvesting” in the field. First, following up on information provided by former student Douglas Himes at the Bureau of Labor Statistics (BLS), my sidekick Chris Matgouranis showed me the table reproduced below (And for more see this).

Over 317,000 waiters and waitresses have college degrees (over 8,000 of them have doctoral or professional degrees), along with over 80,000 bartenders, and over 18,000 parking lot attendants. All told, some 17,000,000 Americans with college degrees are doing jobs that the BLS says require less than the skill levels associated with a bachelor’s degree.

Now I’ve said for a while that one of the great myths of the new depression is the existence of high-tech jobs demanding high education. At this point there are millions of college-educated people out of work or substantially underemployed. You cannot fix the roots of the current economic malaise by generating more job seekers – no matter how well educated or qualified. The brutal fact is, very little of our current economy actually depends on new technology. Think of it this way – the leading cell phone platform is built on thinking and a technology concept first developed at Xerox PARC in the 70’s. Very little of today’s “new technology” development is actually development – it is execution against old technology. So if you train them, what would this new legion of scientists and engineers do?

And there is the crux of the problem.

 

Posted by on January 8, 2012 in Great American Rip-Off, The Post-Racial Life

 


A Faster Internet?

I see MIT has finally caught up with some of the work being done in the late 90’s on flow switching, a technology variant of optical switching. These are not “routers” as described in the article, nor can they be made to operate utilizing the current TCP/IP variants in the Internet today as anything except “passengers” within the core structure.

I don’t believe PC speed drives this. The reason is that the software overhead of bloated functions is the principal consumer of processor power in consumer PCs – not network functions. Nor does file size drive the need. What drives it is the evolution of personal and home “cloud” computing, wherein clusters of smart and not-so-smart devices perform various functions in the home and business, interacting with each other and with data sources on the Internet. Instead of 1 or 2 computing devices per person, you evolve to 10 or 12.

Fascinating though…

MIT Researchers Promise an Internet That’s 100x Faster and Cheaper

MIT researchers have developed technology that they say not only will make the Internet 100 to 1,000 times faster, but also could make high-speed data access a lot cheaper.

The trick to such dramatic performance gains lies within routers that direct traffic on the Internet, according to Vincent Chan, an electrical engineering and computer science professor at MIT, who led the research team. Chan told Computerworld that replacing electrical signals inside the routers with faster optical signals would make the Internet 100, if not 1,000 times faster, while also reducing the amount of energy it consumes.

What would the Internet be like if it ran that much faster?

Today, a user who has a hard time downloading a 100MB file would be able to easily send a 10GB file, with the Internet running 100 times faster, according to Chan.
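The arithmetic behind Chan’s example is worth spelling out: 10GB is roughly 100 times 100MB, so at 100 times the speed, the big file takes about the same wall-clock time the small one does today. A quick back-of-envelope check (the link rate is an illustrative figure of my own, not from the article):

```python
# Back-of-envelope: transfer time = size / rate
current_rate_mb_s = 10          # assumed link rate today, in MB/s
small_file_mb = 100             # the 100MB file from Chan's example
large_file_mb = 10 * 1024       # a 10GB file

time_today_s = small_file_mb / current_rate_mb_s
time_at_100x_s = large_file_mb / (current_rate_mb_s * 100)

print(round(time_today_s, 1))   # time for 100MB at today's rate
print(round(time_at_100x_s, 1)) # time for 10GB at 100x -- roughly the same
```

Whatever the assumed starting rate, the ratio holds: a 100x faster network moves a 100x larger file in the same time.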

“We’re looking to the future when computer processors are much more powerful and we have much bigger downloads and applications,” Chan said. “When we get more powerful processors, people will be clamoring for more speed. The question is, can these new processors and their powerful applications be supported over the Internet? Everyone will be using more high-rate applications, like 3D, interactive games, high-speed financial trading.”

And when that happens, Chan said users of those large applications will run into choke points on the Internet. And that could happen as soon as 16-core processors hit the market, if not sooner. “I think the Internet will not be fast enough within three to five years,” he added.

The answer, he said, is optical fibers, which carry light pulses.

Optical fibers are used widely on the Internet, spanning great distances and even continents. While they transmit information more efficiently than electrical signals, optical signals are complicated to deal with. A router, for instance, has problems handling optical signals coming from different directions at the same time. To get around that problem, routers on the Internet generally take in optical signals and convert them to electrical signals so they can be stored in memory until they can be processed, said MIT’s report. After that, the electrical signals are converted back to optical signals so they can be sent back out.

That process eats up chunks of time and energy. Chan and his team have developed technology that would eliminate the need for such conversions.

Chan’s architecture, which is called “flow switching,” establishes a dedicated path across the network between locations that exchange large volumes of data — from Silicon Valley to Boston, for instance. MIT explained that routers along that path would only accept signals coming from one direction and send them off in only one direction. Since the optical signals aren’t coming from different directions, there’s no need to convert them to electrical signals for storage in memory.
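The savings described above can be modeled as skipping the optical-electrical-optical (OEO) conversion at every intermediate hop, paying it only at the endpoints. A toy latency model (all numbers and names here are mine, chosen only to illustrate the structure of the argument, not measured figures):

```python
def path_latency_us(hops, per_hop_forward_us=1.0, oeo_conversion_us=20.0,
                    flow_switched=False):
    """Total latency across a path of `hops` routers, in microseconds.

    A conventional router converts optical -> electrical -> optical at
    every hop so it can buffer the signal in memory; a flow-switched path
    keeps the signal optical end to end, converting only at the two
    endpoints.
    """
    conversions = 2 if flow_switched else hops
    return hops * per_hop_forward_us + conversions * oeo_conversion_us

# A 10-hop path: conventional routing vs. a dedicated flow-switched path
print(path_latency_us(10))                      # every hop pays conversion
print(path_latency_us(10, flow_switched=True))  # only the endpoints do
```

The point of the sketch is that the conversion term grows with path length under conventional routing but stays constant under flow switching, which is where the claimed gains come from.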

“If this can truly jack up Internet data speeds by 100 times, that would have a huge impact on the usability of the Net,” said Dan Olds, an analyst at Gabriel Consulting Group Inc. “We’d see the era of 3D computing and fully immersive Internet experiences come much sooner…. If this turns out to be practical, it could be a very big step forward.”

Dealing with network bottlenecks would be a huge accomplishment, said Rob Enderle, principal analyst at Enderle Group.

“Right now, the network is the bottleneck for hosted computing. This change could transform the industry as we know it,” said Enderle. “We are going to need a faster Internet. We need it now. Currently, we only have about 20% [of available bandwidth] in many places.”

 

Posted by on July 3, 2010 in News

 


 