Stepping up her campaign to audit the Los Angeles Unified School District, City Controller Laura Chick filed a public records request Tuesday with the school district, seeking audits, reports and studies for the past five years.

In a letter to district Superintendent Roy Romer, Chick also demanded to know what steps the nation's second-largest school district has taken to implement any audit recommendations.

"I am extremely serious in my efforts to obtain a clear picture of what is happening in our school district," Chick said. "LAUSD officials have publicly stated that my independent audit is not needed, that they have been audited enough. So I am making a Public Records Act request to not only obtain the various audits and reports, but to see what actions have been taken to implement the various recommendations."

With Mayor Antonio Villaraigosa pushing to reform Los Angeles' public schools, Chick offered earlier this month to oversee a planned audit of the district's administrative operations.

Romer could not be reached for comment Tuesday. But last week, he said he opposed inviting Chick to audit the district – at an estimated cost of $800,000 – noting that he objected as much to the way Chick was demanding the audit as to the need for it.

Chick, a Villaraigosa ally, has no authority to audit any LAUSD operations and must be asked to conduct such a review.

Some school officials have questioned whether Chick's audit is needed because several audits already have been conducted and others are planned. The district is about to undertake separate audits of its administration as part of an agreement with United Teachers Los Angeles, and has planned another audit to be performed by the firm KPMG.

Chick has said she wants to conduct her own review to provide an independent report to the public on how the district is operating. In her request to Romer, Chick said she was particularly interested in audits that have been done by the U.S. Department of Education, the U.S. Department of Agriculture, the state Department of Education, the state Controller's Office, the Bureau of State Audits, the county Office of Education, Scholastic Audit Intervention Teams, the Bond Oversight Committee and the Office of Inspector General.

School board member David Tokofsky said he welcomes Chick's request and believes she should review reports even older than five years.

"What she will be doing is ignoring perhaps the biggest decision of the school district – Belmont High School," Tokofsky said. "If you want to look into the problems of the district, you have to start there."

Tokofsky, who chairs the school board's audit committee, said he earlier had offered to provide Chick the information she is now seeking.

"I had all the people who know about these things in my office and I was talking to her on the phone," Tokofsky said. "I told her we could tell her anything she wanted to know before she made a decision to make an audit. She said she didn't want to hear any of that."

Chick said she didn't recall such an offer from Tokofsky, but welcomes any information he or the district can provide.

Rick Orlov, (213) 978-0390
Another Donegal post office is set to close its doors for good at the end of this month. Kilcar Post Office will join a raft of other postal outlets across the county to close this year.

Deputy Pat the Cope Gallagher has expressed his shock at the decision by An Post. He said: "Kilcar Post Office effectively serves the entire parish of Kilcar, as there is no other post office within the district. Not for the first time, I have stated that there is no strategic thinking in how An Post is permanently closing these post offices; this post office is critical to the overall services provided for the town of Kilcar."

Based on the precedent of previous appeals lodged with An Post over the numerous other post office closures, Pat the Cope added that an appeal seems to be a pointless exercise, but that the community of Kilcar must make every effort to save their post office.

He added: "The ultimate decision that has allowed these post office closures was signed off on by this present Government, and it certainly is doing untold damage to rural Ireland.
The loss of a post office to any town or village has a massive, devastating effect on that area."

"The Government continues to fail to fully understand what it is like for rural communities. It is the most anti-rural Government in the history of this state, and this is another example of the disastrous policies that are ultimately shutting down rural Ireland step by step."

Shock as another Donegal post office to close at end of month was last modified: June 13th, 2019 by Stephen
OAKLAND — Warriors forward Andre Iguodala sat out Sunday's game against the Los Angeles Clippers at Oracle Arena.

Iguodala has missed his third consecutive game because of left toe soreness, while Shaun Livingston initially was considered questionable after missing the second half of Friday's game against Cleveland because of a left knee contusion. Warriors …
ALAMEDA — Gareon Conley admitted to some concern while lying on the sideline after being inadvertently hit in the head by the thigh of teammate Johnathan Abram.

Yet by the time medical personnel had removed Conley's facemask, strapped him to a cervical board and were taking him from the Coliseum on a stretcher, the cornerback's concern had shifted to his family.

"I don't want my family to think something is more than what it is," Conley said Friday as the Raiders wrapped up preparations to …
In new developments in the search for MH370, search co-ordinator and retired defence chief Air Chief Marshal Angus Houston has confirmed that the Bluefin-21 remote underwater vehicle will be released to search for the black boxes on the ocean floor.

Last Tuesday (April 8th) was the last time a ping was heard from the possible black box, and given it has been 38 days since the crash, it is likely that the batteries in the black boxes have run out. As a result, the Ocean Shield will stop towing the pinger locator today.

Mr Houston reports that the four signals acquired early last week give the most promising lead in the search for MH370. "The four pings that were heard by Ocean Shield early last week have helped refine and reduce the search zone into a manageable area," he stated. Given that the batteries in the black box locators have likely expired, coupled with a refined search area, it is now appropriate to send down the Bluefin-21 to search and map the ocean floor. The depth of the search area is 4,500m, which is at the limit of the Bluefin's capabilities. Whilst there are other vehicles that can go deeper, these will only be sourced should they be needed.

Mr Houston stresses that this is a slow and painstaking process. Over a 24-hour period, it will take two hours for the Bluefin to reach the ocean floor and another two hours to return to the surface. The vehicle will search the floor for 16 hours, and it will take approximately four hours for the data to be downloaded once it is back on the surface.

The first search zone will be 5km x 8km, giving a total search area of 40 square kilometres.

In other news, two litres of oil from a slick in the search area have been collected and are en route to a lab for identification. Mr Houston again stressed that whilst the oil slick was found within the search zone, there is no guarantee that it is from the missing plane.
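The timetable and search-zone figures quoted above can be sanity-checked with simple arithmetic. The sketch below is purely illustrative, using exactly the numbers reported in this article (it is not an official search-authority calculation):

```python
# Bluefin-21 daily mission cycle, using the figures quoted above.
DESCENT_H = 2    # hours to reach the ocean floor
SEARCH_H = 16    # hours mapping the floor per dive
ASCENT_H = 2     # hours to return to the surface
DOWNLOAD_H = 4   # hours to download the collected data

cycle_hours = DESCENT_H + SEARCH_H + ASCENT_H + DOWNLOAD_H
print(cycle_hours)  # 24 -- one full dive per day

# First search zone: 5 km x 8 km
zone_km2 = 5 * 8
print(zone_km2)  # 40 square kilometres
```

At one dive per day, only 16 of every 24 hours are spent actually scanning, which is why Mr Houston describes the process as slow and painstaking.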
12 October 2015

The United Nations has launched the Nelson Mandela Rules, a set of guidelines to protect the rights of detainees.

Secretary-General Ban Ki-moon welcomed the Revised Standard Minimum Rules for the Treatment of Prisoners, describing them as "a great step forward", on 7 October.

The United Nations Commission on Crime Prevention and Criminal Justice adopted the crucial revisions of the 60-year-old international standards on the treatment of prisoners at a meeting on 22 May in Vienna, Austria. Now the Mandela Rules have been adopted by the UN General Assembly, which has published them.

UN General Assembly president Mogens Lykketoft recalled the spirit of Mandela. "It is said that no one truly knows a nation until one has been inside its jails," he quoted. "A nation should not be judged by how it treats its highest citizens, but its lowest ones."

Lykketoft said that nations had failed to protect the human rights of prisoners. Too often, the driving principle behind prisoner treatment had been to see these individuals as entirely separate from communities and societies.

"Hidden from our gaze, and indeed sometimes before our very eyes, prisoners have suffered abuse and mistreatment."

The basic outline

The Mandela Rules "outlines that there shall be no discrimination; that the religious beliefs and moral precepts of prisoners shall be respected; and that legal representation and protection are mandated in regard to vulnerable groups within the prison populations", reads the UN website.

Ivan Šimonović, assistant secretary-general for human rights, said the revised rules were much more specific on matters such as defining the scope of solitary confinement, and offered first-time guidance on intrusive searches, including strip and body cavity searches.

But implementation could be a challenge, said Lykketoft.
“The crucial challenge for member states will be to translate these rules into a reality and to increase co-operation both within and outside the UN system to improve the lives of prisoners throughout the world.”Šimonović added: “That is what Mr Mandela would have expected from us.”South Africa chaired the expert group in the revision of the Standard Minimum Rules.The Mandela Rules now contain an expanded section of basic principles, including the absolute prohibition of torture and other cruel, inhuman or degrading treatment or punishment. The independence of health care staff is assured, and extensive restrictions are placed on disciplinary measures, including the prohibition of solitary confinement beyond 15 days.Clear and detailed instructions are provided on issues such as cell and body searches, registration and record keeping, investigations into deaths and complaints of torture and other ill-treatment, the needs of specific groups, independent inspections of prisons, the right to legal representation and more.Source: United Nations
By Leisa Boley Hellwarth, a dairy farmer and attorney near Celina

The very first real estate closing I attended for a client was over 25 years ago. I was fresh out of law school, and a friend of mine, who was a cosmetics sales associate at Marshall Field's, was purchasing her first house. It was what lawyers refer to as a "roundtable closing," which meant that the buyer, the seller, their realtors and their lawyers all met around a conference table, signed documents and transferred payment. This event is forever embedded in my memory because, as a new lawyer, I proudly showed up in a nice dark suit. And literally every other person, all of them women, was wearing a floral dress and lots of fragrance. Just thinking about it makes my sinuses hurt.

The closing went off without a hitch. But less than 24 hours later, my client was mad, really mad, all because the seller took the fireplace tools. I can assure you these fireplace tools were not anything special, but my client demanded representation on this issue, and I did my job. Fortunately, the sales contract that I had reviewed specifically stated that the fireplace tools remained with the house. So within 48 hours, the buyer was happy again, and the seller was without the aforementioned fireplace tools.

The moral of this story is that the most important document in a closing is the sales contract, which spells out the specific requests of the buyer and the seller, provides a timetable, and governs the transaction. Most realtors do a wonderful job with sales contracts, which are usually forms approved by the local realtors and attorneys. It does matter what is stipulated in this document, so please pay close attention that every part of the deal is included in the contract in writing.
If not, chances are that provision will not be enforceable.

With purchases involving farmland, it is important to consider whether crops are growing on the land being sold. If so, the purchase agreement (another term for the sales contract) should stipulate that the current owner or tenant has the right to harvest the growing crops.

The biggest difference between real estate closings in Franklin County and those in my rural home area of Mercer County is title insurance. When I practiced in Columbus, the seller purchased title insurance, which guaranteed clear title to the buyer. In Mercer County, the seller usually promises clear title in the sales contract, but it is up to the buyer to do his due diligence, which means he typically hires a local lawyer to do a title examination and offer an opinion on the title. The net effect is the same. This is a very important step in the process. On more than one occasion, I have done title research that led me to tell the buyer he would be better off finding a different property because of all of the issues uncovered when examining the documents filed on the property at the courthouse.

In Ohio, property taxes are paid one year in arrears. Typically, at closing, the seller pays a pro-rated tax bill to cover his portion of the property tax, so future tax bills will be the responsibility of the buyer.

The seller pays for the drafting of the deed as well as the conveyance fee on the property. In Mercer County, that is $3.50 for every $1,000 of the sales price. The buyer pays all recording fees, which include a 50-cent per-parcel transfer fee at the auditor's office and recording fees at the Recorder's Office. In Mercer County, that is $28 for the first two pages and $8 per page thereafter. And there is a $20 fee if your documents do not conform to the template at the recorder's office.

Obviously, these charges vary from county to county in the state.
I just included what my local clients deal with as an example of the process.

I just held a "roundtable closing" a few days ago. Times have changed. The sellers were three women, and no one wore a dress or excessive fragrance. I left my dark suits in Columbus, so I had on jeans and boots. And all parties departed the closing happy because a fair price was negotiated between the buyer and the seller. All the legal documents in the world cannot create that kind of harmony.
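As a rough illustration, the Mercer County figures in this column can be folded into a quick back-of-the-envelope estimate. The function below is a sketch for illustration only, not legal advice: it assumes the conveyance fee applies per full or partial $1,000 of sale price, and all rates are the ones quoted above, which change over time and vary by county, so verify with the county auditor and recorder.

```python
import math

def mercer_county_closing_fees(sale_price, deed_pages, parcels=1, conforming=True):
    """Rough estimate of seller/buyer fees from this column's Mercer County figures.

    Assumptions (illustrative only): conveyance fee of $3.50 per full or
    partial $1,000 of sale price (paid by seller); recording at $28 for the
    first two pages and $8 per page thereafter, plus a $0.50 per-parcel
    transfer fee and a $20 surcharge for non-conforming documents (buyer).
    """
    conveyance = 3.50 * math.ceil(sale_price / 1000)   # seller pays
    recording = 28 + 8 * max(0, deed_pages - 2)        # buyer pays
    transfer = 0.50 * parcels                          # buyer pays, at auditor's office
    surcharge = 0 if conforming else 20                # non-conforming template fee
    return {"conveyance": conveyance,
            "recording": recording + transfer + surcharge}

fees = mercer_county_closing_fees(sale_price=150_000, deed_pages=3)
print(fees)  # {'conveyance': 525.0, 'recording': 36.5}
```

So on a hypothetical $150,000 farm sale with a three-page deed, the seller's conveyance fee comes to $525 and the buyer's recording costs to $36.50, before title work and any other closing charges.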
By the time you read this today, the BP/Transocean/Halliburton oil hemorrhage may finally be on its way to a resolution. Or it may still be burbling away, happily coating wildlife, habitat and the region's tourism and fishing industries with a viscous sheen of Game Over.

A lot of us have taken to our networks to fulminate over this without a lot of focus or hope of affecting things – me included. Of course, sometimes you just have to vent (as a certain large, gaping opening in a BP oil pipe could tell you). And raising awareness is a Good Thing.

But some folks are taking it beyond just a few retweets, and using online tools to genuinely contribute to our understanding of the disaster. Take Paul Rademacher's use of Google Earth to map the extent of the oil spill onto any location on Earth – say, your own hometown – and gain a sense of the geographical scope of the situation. (It's possible, in turn, because of Google's impressive crisis response page for the spill, which has a collection of mapping layers and resources.)

Or look at Oil Reporter, an open-source app for the iPhone and Android that lets ordinary people log individual instances of oil spill impacts they discover – crowdsourcing the documentation of the spill's effects, Ushahidi-style. It's created by the good people at CrisisCommons, which has partnered with the San Diego State University Visualization Center to manage the data collected through the app – which is available through an open API.

That hasn't meant one less drop of oil has come out of that pipe.
But these two initiatives, and others like them, can help buttress support for spending the billions that will be needed to do what can be done to clean up the aftermath – and in the case of Oil Reporter, help point out places some of those resources should go.

And with any luck, it could spur some people like me who've confined our activism to subscribing to the BPGlobalPR Twitter feed (which is often funny as all hell) to do something a little more meaningful.

Rob Cottingham
By Matt Asay

Today the race is on to virtualize all aspects of the data center. Dubbed the software-defined data center (SDDC), and closely tied to software-defined networking (SDN), this is a market IDC projects will top $3.7 billion by 2016.

It's a hot market, too: just this week, Cisco, IBM, VMware, Red Hat and others banded together under a Linux Foundation-hosted consortium called OpenDaylight. But while this is a significant step toward virtualizing the networking layer of the data center, it may simply be a prelude to the next phase of virtualization: storage.

VMware led the way in virtualizing servers in the data center, creating enormous value for its shareholders over the last decade. Originally acquired by EMC for $635 million in 2003, VMware is now a standalone company with a market capitalization of more than $30 billion. Last year it acquired a leading SDN startup, Nicira, for nearly $1.3 billion. That move scared a lot of data center vendors – primarily Cisco – who don't want to see VMware dominate networking virtualization as completely as it came to own server virtualization.

Too often overlooked amid the billions of dollars sloshing around the server and networking competition in SDDC is the laggard: storage. Traditional storage is a $10 billion annual business, but until recently it hasn't made much headway into virtualization.

That may be about to change. To better understand the trends shaping the rise of software-defined storage, I sat down recently with Dr. Kieran Harty, CEO of Tintri, maker of storage systems for software-defined data centers and a core virtualization pioneer.
Harty ran engineering at VMware from 1999 to 2006, and his teams created the software products that virtualized the server side of the SDDC equation.

ReadWrite: Remind us again what VMware was trying to do a dozen years ago when your teams were focused on bringing virtualization to servers.

Harty: The basic problems virtualization solved back then were what we called server consolidation and over-provisioning. Businesses wanted to move compute workloads from large, costly, proprietary single servers (usually Sun servers) running one application, oftentimes at only 10% of capacity, to clusters of cheap, commodity Linux servers. VMware pioneered a technology called the hypervisor that made this possible – on the server.

ReadWrite: Today VMware enjoys roughly 90% market share in server virtualization. The spectacular success of server virtualization raises the big question of what comes next. Can the same benefits of virtualization on servers be applied to the rest of the data center?

Harty: This is what gives rise to the concept of the software-defined data center (SDDC): a data center with infrastructure that is fundamentally more flexible, automated and cost-effective; infrastructure that understands application workloads and can automatically and efficiently allocate pooled resources to match application demands. Rather than construct data centers full of over-provisioned and siloed resources, a SDDC would more efficiently utilize and share all aspects of the infrastructure: servers, networking and storage.

While servers, and to a lesser extent networks, have embraced SDDC, storage lags significantly behind and continues to cause a great deal of pain in the data center today.
Fortunately, some of the key technologies that brought sweeping changes to servers and networks are taking shape for storage.

ReadWrite: What kind of changes?

Harty: A quick look at some of the most successful disruptive technologies reveals that many of them "crossed the chasm" with the help of a few common key ingredients: standardization, hardware innovation and abstraction. In the case of server virtualization, the standardization of Intel's x86 platform and the proliferation of the open source Linux operating system massively disrupted the server market. Armed with a new generation of multi-core processors and VMware's hypervisor technology, server virtualization conquered the data center. Networks followed a similar path, starting with TCP/IP standardizing the network protocol. Gigabit Ethernet increased transmission speed by an order of magnitude. OpenFlow, which set the foundation of open, standards-based software-defined networking, paved the way for the most significant changes in networks in several decades.

ReadWrite: What kinds of changes in standards, hardware innovation and abstraction are leading to disruption in the storage market?

Harty: For 20 years, little has changed in the world of legacy storage designed for physical environments. As data centers become more virtualized, there is a growing gap due to the complete mismatch between how storage systems were designed and the demands of virtual environments. It's a bit like people who don't speak the same language and have a hard time understanding each other – storage speaks LUNs and volumes; servers speak VMs. As a result, they don't understand each other very well. Storage allocation, management and performance troubleshooting for the virtualized infrastructure are difficult, if not impossible, with legacy storage.
Companies have tried to work around this obstacle by over-provisioning storage, which is very expensive and increases complexity.

ReadWrite: Is this where flash technology enters and disrupts storage? Can we power through these legacy storage challenges with performance improvements that are an order of magnitude over those of traditional spinning disk?

Harty: Storage has always been about performance and data management. Flash removes the performance challenges and levels the competitive playing field for storage vendors. Flash enables very dense storage systems that can host thousands of VMs in just a few rack units of space. But flash by itself – without the intelligence – only gets us so far. And while some industry players are attempting to make virtualization products adapt to legacy storage through APIs, or to retrofit legacy storage to become virtualization-aware, neither goes far enough to bridge the yawning gap between these two mismatched technologies – you can put lipstick on a pig, but it's still a pig. What is needed to solve this problem is storage that has been completely redefined to operate in the virtual environment and uses the constructs of virtualization. In short, VM-aware storage.

ReadWrite: What do you mean, VM-aware?

Harty: Virtualized environments require storage designed for virtualization. Enterprises expecting to get the full benefit of the software-defined data center need storage that's simple and agile to manage, while delivering the performance required by modern applications. They will need storage that understands the IO patterns of virtual environments and that automatically manages quality of service (QoS) for each VM. We eliminate an entire layer of unnecessary complexity if we stop talking about LUNs or volumes. The broad adoption of virtual machines as the data center lingua franca gives us de facto standardization for software-defined storage.
The rapid growth and declining cost of flash technology provide the hardware innovation. That leaves the one last essential missing piece: the abstraction between storage and VMs – an abstraction that understands VMs while being able to abstract and pool the underlying storage resources, delivering the benefits of simple, high-performing and cost-effective storage. We call that VM-aware storage.

Image courtesy of Shutterstock.
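Harty's idea of per-VM quality of service can be made concrete with a toy model. The sketch below is purely illustrative – the class name and allocation policy are invented for this example and are not Tintri's (or any vendor's) actual design: instead of carving capacity into LUNs, the array tracks each VM, honours a minimum IOPS guarantee per VM, and shares out whatever headroom remains.

```python
# Toy model of VM-aware storage QoS: the array reasons about VMs, not
# LUNs. Names and the allocation policy are illustrative, not a real API.
class VmAwareArray:
    def __init__(self, total_iops):
        self.total_iops = total_iops
        self.guarantees = {}  # VM name -> minimum IOPS guarantee

    def register_vm(self, name, min_iops):
        """Admit a VM only if its guarantee still fits within the array."""
        if sum(self.guarantees.values()) + min_iops > self.total_iops:
            raise ValueError("array cannot honour this guarantee")
        self.guarantees[name] = min_iops

    def allocate(self):
        """Give every VM its guarantee, then split spare IOPS evenly."""
        spare = self.total_iops - sum(self.guarantees.values())
        share = spare / len(self.guarantees) if self.guarantees else 0
        return {vm: g + share for vm, g in self.guarantees.items()}

array = VmAwareArray(total_iops=10_000)
array.register_vm("web-01", 2_000)
array.register_vm("db-01", 5_000)
print(array.allocate())  # {'web-01': 3500.0, 'db-01': 6500.0}
```

The point of the toy is the unit of accounting: because every decision is keyed by VM name rather than by LUN or volume, adding, moving or troubleshooting a VM never requires translating between mismatched storage and server vocabularies.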